
Frostbite Technical Director on why Frostbite never came to Wii U

NFS is the definition of a game that does not seem to be CPU bound - it was ported mostly intact to the PlayStation Vita, which is souped-up smartphone hardware. I don't know if I would use that as an example to illustrate why there are no problems with the performance of the WiiU CPU.

Ugh, can we please stop this? Every time a Wii U title performs badly it's the hardware blamed. Every time it performs better "Well, the game clearly wasn't CPU bound" or some other reason is given to excuse the performance.
 
Ugh, can we please stop this? Every time a Wii U title performs badly it's the hardware blamed. Every time it performs better "Well, the game clearly wasn't CPU bound" or some other reason is given to excuse the performance.

As opposed to the cherry-picking wherein every game that is on par or better on the Wii U is heralded as a shining beacon of what every game can be while every subpar port is dismissed as having been ported by lazy developers?
 

dysonEA

Member
As opposed to the cherry-picking wherein every game that is on par or better on the Wii U is heralded as a shining beacon of what every game can be while every subpar port is dismissed as having been ported by lazy developers?

Developers can be very lazy when it comes to ports, and I believe they usually are with Wii U ports. Games need to be optimized to perform better on the Wii U, and devs just don't take the time needed to do a proper job.
 

RedSwirl

Junior Member
Did Nintendo really not expect the next generation PlayStation and Xbox to be at a certain level in terms of hardware?

I don't think Nintendo has been caught blind here. I think they made a deliberate decision on the Wii U's hardware power because they don't really care about $60 million hardcore cinematic 18-35 male demographic experiences appearing on the system. They probably don't care about the kinds of experiences Frostbite 3 would provide. I think the problem is that they expect the EAs of the world to come around to their way of doing things.
 

SMT

this show is not Breaking Bad why is it not Breaking Bad? it should be Breaking Bad dammit Breaking Bad
whatareyoutalkingaboutwhoareyoutalkingto.gif

A quick and easy way to identify a poster is to look at the name/avatar found to the left of their post. I'm just going to assume that your first post was some bizarre attempt at sarcasm and that you don't actually believe that the Wii-U has a Pentium in it.

And I'm just going to assume that your knock on my humor was a bizarre attempt at humor.
 

Gestault

Member
Ugh, can we please stop this? Every time a Wii U title performs badly it's the hardware blamed. Every time it performs better "Well, the game clearly wasn't CPU bound" or some other reason is given to excuse the performance.

With respect, what's a single other example of a Wii U game performing better that people said was related to CPU performance? Just one.
 

Schnozberry

Member
With respect, what's a single other example of a Wii U game performing better that people said was related to CPU performance? Just one.

Injustice runs better on Wii U compared to PS3. Whether or not that's related to CPU performance I can't tell you. But it's noticeable in the cut scenes when more than a couple of characters are on screen.
 

Kai Dracon

Writing a dinosaur space opera symphony
Did Nintendo really not expect the next generation PlayStation and Xbox to be at a certain level in terms of hardware?

I don't think Nintendo has been caught blind here. I think they made a deliberate decision on the Wii U's hardware power because they don't really care about $60 million hardcore cinematic 18-35 male demographic experiences appearing on the system. They probably don't care about the kinds of experiences Frostbite 3 would provide. I think the problem is that they expect the EAs of the world to come around to their way of doing things.

Curiously, when there was initially bluster about how Wii U would finally make it so easy for 3rd parties to port all their AAA games to a Nintendo platform, was that jazz mostly hype from Reggie at NOA? Or did Iwata ever play that angle up in Japan?

Because I always suspected such talk was mostly more of Reggie's bravado. More realistically, back in Japan I would assume they were viewing the hardware the way Nintendo has always viewed their hardware... always. Designed to the tastes and particulars of their own engineers and lead software developers first, and screw what anyone else is doing.

This kind of thing isn't new. Back in the day, there was quite a controversy over the Super Nintendo's "frustrating" hardware featuring a slow CPU that made ports of Genesis action games run poorly. And limited the scope of arcade style games built from the ground up for the SNES. Every Nintendo console has been built to suit Nintendo first. (The Wii was just the most extreme. The underlying philosophy has never changed.)
 
Did Nintendo really not expect the next generation PlayStation and Xbox to be at a certain level in terms of hardware?

I don't think Nintendo has been caught blind here. I think they made a deliberate decision on the Wii U's hardware power because they don't really care about $60 million hardcore cinematic 18-35 male demographic experiences appearing on the system. They probably don't care about the kinds of experiences Frostbite 3 would provide. I think the problem is that they expect the EAs of the world to come around to their way of doing things.

Yes I think they expect some titles to miss the console and they are probably ok with it given their decisions, but at the same time they are hoping some hardcore games are released on the Wii U. I would think given the 3rd party list of games to be released that they are probably at least satisfied by it.

What I know they want for sure is that 3rd partys build gaming experiences from the ground up to take advantage of the Wii U's capabilities, just like with the Wii. I know this takes money but Zombi U proves it does not take a AAA budget to do it.
 

SMT

this show is not Breaking Bad why is it not Breaking Bad? it should be Breaking Bad dammit Breaking Bad
Injustice runs better on Wii U compared to PS3. Whether or not that's related to CPU performance I can't tell you. But it's noticeable in the cut scenes when more than a couple of characters are on screen.

The PS3's GPU is weaker than the WiiU's; that could be it.
 
As opposed to the cherry-picking wherein every game that is on par or better on the Wii U is heralded as a shining beacon of what every game can be while every subpar port is dismissed as having been ported by lazy developers?

But that makes more sense. Every other console is judged by its best-looking games, not its worst, and yet the Wii U is an exception. If the PS3 was judged based on every multiplatform title it had that was crappy vs. the 360...
 
But that makes more sense. Every other console is judged by its best-looking games, not its worst, and yet the Wii U is an exception. If the PS3 was judged based on every multiplatform title it had that was crappy vs. the 360...

Well, a few things spring to mind. One, the Wii U had seven years on the 360. Two, the ratio of launch-window ports suggesting a technological advantage to ports suggesting deficiencies skewed overwhelmingly toward the latter. Three, the games in the win column don't really demonstrate a marked improvement. On that last point, I think a lot of the concerns might have been mitigated if there had been a first-party showcase to point to, instead of "here's a couple of ports with marginal improvements and a couple more that are on par."

As is, I stand by that accusation of bias there. At this stage in the game, it just seems kind of silly to only want to focus on the few good examples of Wii U ports and deride all evidence to the contrary as not counting, while also deriding people who don't agree with you for not wanting to count the good games for whatever reason. At best, the fair thing to do is to conclude that the jury is still out on exactly how the Wii U stacks up to the competition.
 
I don't know why some GAFfers are quick to throw out cliche responses like "salt", "fanboy", or "edgy d00d". EA and Nintendo announced an "unprecedented partnership". Now no EA games are coming to the platform. Obviously something happened.

Now, if you want to say it's all Nintendo's fault the relationship fell apart, by all means, say that. But are people actually disputing that the "unprecedented partnership" collapsed?
 

A More Normal Bird

Unconfirmed Member
And I'm just going to assume that your knock on my humor was a bizarre attempt at humor.
Pretty good comeback.

Oh, you must be my friend Dave, hello.

I don't know why some GAFfers are quick to throw out cliche responses like "salt", "fanboy", or "edgy d00d". EA and Nintendo announced an "unprecedented partnership". Now no EA games are coming to the platform. Obviously something happened.

Now, if you want to say it's all Nintendo's fault the relationship fell apart, by all means, say that. But are people actually disputing that the "unprecedented partnership" collapsed?

I think for many those early comments about things like CPU power stuck. It's also easier to just assume that it's simply a case of capability because that's simpler than unorthodox architecture + average power + strained relationship + low install base = not worth the effort. The question is how many of those factors would you have to remove before EA decided it would be worth putting one of their A teams on getting the engine up and running? I mean, Crytek did it, but they have outside licensees to consider.
 

Schnozberry

Member
Well, a few things spring to mind. One, the Wii U had seven years on the 360. Two, the ratio of launch-window ports suggesting a technological advantage to ports suggesting deficiencies skewed overwhelmingly toward the latter. Three, the games in the win column don't really demonstrate a marked improvement. On that last point, I think a lot of the concerns might have been mitigated if there had been a first-party showcase to point to, instead of "here's a couple of ports with marginal improvements and a couple more that are on par."

As is, I stand by that accusation of bias there. At this stage in the game, it just seems kind of silly to only want to focus on the few good examples of Wii U ports and deride all evidence to the contrary as not counting, while also deriding people who don't agree with you for not wanting to count the good games for whatever reason. At best, the fair thing to do is to conclude that the jury is still out on exactly how the Wii U stacks up to the competition.

I think you're misattributing the shortfalls of the launch games to the hardware rather than the tools. If you read wsippel's comments from this thread regarding tools, I think it makes more sense. Couple that with the commentary from Criterion and Miyamoto's own comments about not having the right tools in developers hands until right before launch, and I think the performance problems come into focus.

The Wii U hardware is limited in comparison to the PS4 and Next Xbox, but I think once we get some games built from the ground up with the system in mind, we'll get a clearer view of what it is capable of. The first year of any console is always a broken-mirror view of its real potential. The 360 was called Xbox 1.5 basically right up until Gears of War released.
 

SMT

this show is not Breaking Bad why is it not Breaking Bad? it should be Breaking Bad dammit Breaking Bad
Pretty good comeback.

Oh, you must be my friend Dave, hello.

I'm speechless, you figured me out.
The internet is no longer a safe place, Tom.
Be wary, there are breaks in the 'unprecedented partnership' among us.
http://www.youtube.com/watch?v=VwzRLgJorYQ


It's okay everyone, I still haven't seen Nintendo's next-gen console, I hope it's good.
I'm hoping for an unveil at E3.
 

A More Normal Bird

Unconfirmed Member
I'm speechless, you figured me out.
The internet is no longer a safe place, Tom.
Be wary, there are breaks in the 'unprecedented partnership' among us.
http://www.youtube.com/watch?v=VwzRLgJorYQ


It's okay everyone, I still haven't seen Nintendo's next-gen console, I hope it's good.
I'm hoping for an unveil at E3.
Haha. If you're still coming over for the barbecue this weekend I'll show you the new Game Informer, it's got a feature on Ubisoft's exclusive for the next Nintendo console. It's an FPS set in Japan and the graphics look as good as anything on 360 (if not better, there's no jaggies at all) and they say the AI is going to be better than FEAR. The way they use the controller is going to be revolutionary, expect big things at E3.
 

MDX

Member
jmmvTNrEFMURe.jpg
Huh? Now this meme is bugging me.

Isn't the whole point of this pic to show the Wii U stuck outside in the rain, looking into the house, all forlorn and abandoned? I thought it was supposed to be like a poor dog that some evil owner left outside in the rain.

What is the GAF answer on this issue? Is it outside looking in, or is it indoors, looking out?

The WiiU is inside looking out.
As in, it wants to go out and play but its raining.
And if it had games, it would care about being outside.
 
NFS Vita has a significantly lower car count during SP, though.

The Vita version runs on three ARM Cortex-A9 cores at around 800MHz. Being able to beat that while plugged into a wall is not exactly a big win for the WiiU.

Ugh, can we please stop this? Every time a Wii U title performs badly it's the hardware blamed. Every time it performs better "Well, the game clearly wasn't CPU bound" or some other reason is given to excuse the performance.

You say that like you think those reactions are in conflict with each other, when that's exactly what we should expect from a system we know has a more advanced GPU, but somewhat limited CPU power compared to the competition.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Haha, well I did say that, but that was tangential to my main point. I will own it, however. How am I incorrect?
I'll explain, and I'll use an example you yourself provided.

IMHO if you think about the things you can scale in a game, they pretty much come in two classes: things that affect the game design directly (number of enemies is an obvious one) and things that do not (higher resolution textures, more particles, better shaders, anti-aliasing, etc.)
GPUs have rather clear-cut generational advancements. The featureset in a given gen is not the same as the featureset in the previous gen - usually small but consequential changes are made in every new gen. Those changes either allow for new, more efficient implementations of prior art, or make feasible things which previously were not. When you develop for multiple GPUs, you essentially have the following two options:

(a) You develop for the least common denominator featureset (i.e. qualitatively), and then scale things that are easy to scale quantitatively - things like those you mention - asset res, RT res, particles, better/more AA. That, though, would automatically mean that your lead platform is effectively equated to the least common denominator qualitatively. And this is what normally happens with PC games - they pick a least common denominator as a featureset, and then crank up the quantitative dials for the top setups (or dial down the same dials from the recommended config - it's effectively the same, as long as the process is kept relatively linear).

(b) You develop for a given contemporary featureset, and then you effectively rewrite your backend for that downport to that several-gens-old GPU. That normally involves a different shading model, RT setup, cutting off of some effects altogether, but can go as deep as the very fundamentals of the asset production pipeline. Of course you're right to point out that some of those things are of lesser gameplay significance, and that's why they could be removed altogether. But when was the last time you played a modern game in dx8 mode? I mean, what modern game's gameplay would be entirely broken by a visual downgrade to dx8? How about dx7? dx6? I guess you're catching my point. Devs don't just target a myriad of featuresets down to the bottom of the install base, as very soon the effort becomes exponential.

Apparently the latter case above is the more demanding in terms of effort. But that is often the price of competitive advantage on the PC scene. Otherwise you risk looking outdated in a highly tech-competitive scene.
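For what it's worth, option (a) can be sketched in a few lines. This is a minimal Python sketch; the dial names and numbers are all invented for illustration, not from any real engine:

```python
# Option (a): one least-common-denominator featureset, with only the
# quantitative dials scaled per platform. All names/values are hypothetical.

BASELINE = {
    "texture_res": 512,    # asset resolution
    "rt_res": (640, 360),  # render-target resolution
    "particles": 200,      # max particles per effect
    "msaa": 0,             # AA sample count
}

def scale_config(baseline, dial):
    """Scale the quantitative dials; the featureset itself never changes."""
    return {
        "texture_res": int(baseline["texture_res"] * dial),
        "rt_res": tuple(int(d * dial) for d in baseline["rt_res"]),
        "particles": int(baseline["particles"] * dial),
        "msaa": 4 if dial >= 2.0 else 0,
    }

low_end  = scale_config(BASELINE, 1.0)  # least common denominator
high_end = scale_config(BASELINE, 2.0)  # same features, bigger numbers
```

Note that nothing qualitative changes between `low_end` and `high_end` - which is exactly why the lead platform ends up defined by the bottom of the range.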

Now, let's switch to the CPU side of things. CPUs seldom have shifts in their featuresets - a new ISA extension here, a new register file there, but that is most of the time covered by things like compilers. At the end of the day CPUs vary in sheer performance, and performance alone, compared to GPUs, which vary in both performance _and_ featuresets. Bottom line being, it could be a major endeavor to port something to GPU X which was not developed with GPU X's featureset as a lead platform in mind.

As for how CPU workloads could scale, I'll use the example you provided - but more on that in a sec.

I think I made it clear in my last post - that overall, the Xenon has more computational power than the Espresso. The Espresso may be more efficient per watt or per clock, but 3.2GHz of brute force seems to carry the day from what people seem to say.
What is that computational power that Xenon has outside of its SIMD engine? It surely is not IPC and not cache sizes.

Even in this case, where the DLC is basically finished, there are still certification and QA costs involved. It's possible that the game's sales on WiiU do not justify even this minimal cost to bring the DLC over. (I have no idea how many copies of NFS were sold on the WiiU. Maybe if they sold millions of copies, then they really are holding it back out of spite!)
We are talking here about a DD platform full of 10-20 buck indie titles, with the occasional sub-10 sale and sub-5-buck indie DLC content. How costly do you expect NFS's DLC certification/QA to be for that DLC to become financially unfeasible?

NFS is the definition of a game that does not seem to be CPU bound - it was ported mostly intact to the PlayStation Vita, which is souped-up smartphone hardware. I don't know if I would use that as an example to illustrate why there are no problems with the performance of the WiiU CPU.
And here is the moment we return to how CPU workload can scale with power. I'm particularly glad you think Most Wanted on the Vita is intact, as it shows how efficient the scaling can be.
 

Durante

Member
Ugh, can we please stop this? Every time a Wii U title performs badly it's the hardware blamed. Every time it performs better "Well, the game clearly wasn't CPU bound" or some other reason is given to excuse the performance.
Different games have different performance characteristics. This is neither new nor surprising.
 
With this kind of user and developer perception, I'd love to be a fly on the wall in a Nintendo board room right now. As a company, do you just try to power through and lower projections across the board? Do you send the WiiU prematurely to its grave and launch something more powerful that's backwards compatible?
 
I think you're misattributing the shortfalls of the launch games to the hardware rather than the tools. If you read wsippel's comments from this thread regarding tools, I think it makes more sense. Couple that with the commentary from Criterion and Miyamoto's own comments about not having the right tools in developers hands until right before launch, and I think the performance problems come into focus.

The Wii U hardware is limited in comparison to the PS4 and Next Xbox, but I think once we get some games built from the ground up with the system in mind, we'll get a clearer view of what it is capable of. The first year of any console is always a broken-mirror view of its real potential. The 360 was called Xbox 1.5 basically right up until Gears of War released.

Bad early tools could very well be the reason for the dubious, early ports. However, that is neither here nor there. The point to me is simply that the jury is still out on how the Wii U will ultimately stack up. The launch may have painted a worse picture in general -- as launch software can do -- of what the system is capable of, but to me, the story just seemed to be that the Wii U clearly wasn't going to be much of a step up from previous hardware that existed 7 years ago. At launch, it literally wasn't a step up at all, and often lost the tech comparisons save for Trine 2.

Now, it's probably unfair to conclude from there that it's weaker, but there still isn't much to latch on to in terms of this thing having lots of untapped potential. We have Need for Speed and what, that 20-second demo of X that looked pretty good? As is, it's not really all that unfair to point out that current software isn't really indicating that the Wii U is a real beast. But until that Gen 2 software starts to come out and assuage people's concerns, I still stand by the assertion that it's odd to play the "don't make excuses when a game doesn't fit your narrative" card while making excuses when games don't fit your own narrative.
 

SMT

this show is not Breaking Bad why is it not Breaking Bad? it should be Breaking Bad dammit Breaking Bad
Haha. If you're still coming over for the barbecue this weekend I'll show you the new Game Informer, it's got a feature on Ubisoft's exclusive for the next Nintendo console. It's an FPS set in Japan and the graphics look as good as anything on 360 (if not better, there's no jaggies at all) and they say the AI is going to be better than FEAR. The way they use the controller is going to be revolutionary, expect big things at E3.

Will it be open-world? Because Jane loves open-world games, she won't leave the house, so giving her a sandbox game based in Japan would give us a 'vacation' of sorts.
 

Argyle

Member
I'll explain, and I'll use an example you yourself provided.


GPUs have rather clear-cut generational advancements. The featureset in a given gen is not the same as the featureset in the previous gen - usually small but consequential changes are made in every new gen. Those changes either allow for new, more efficient implementations of prior art, or make feasible things which previously were not. When you develop for multiple GPUs, you essentially have the following two options:

(a) You develop for the least common denominator featureset (i.e. qualitatively), and then scale things that are easy to scale quantitatively - things like those you mention - asset res, RT res, particles, better/more AA. That, though, would automatically mean that your lead platform is effectively equated to the least common denominator qualitatively. And this is what normally happens with PC games - they pick a least common denominator as a featureset, and then crank up the quantitative dials for the top setups (or dial down the same dials from the recommended config - it's effectively the same, as long as the process is kept relatively linear).

(b) You develop for a given contemporary featureset, and then you effectively rewrite your backend for that downport to that several-gens-old GPU. That normally involves a different shading model, RT setup, cutting off of some effects altogether, but can go as deep as the very fundamentals of the asset production pipeline. Of course you're right to point out that some of those things are of lesser gameplay significance, and that's why they could be removed altogether. But when was the last time you played a modern game in dx8 mode? I mean, what modern game's gameplay would be entirely broken by a visual downgrade to dx8? How about dx7? dx6? I guess you're catching my point. Devs don't just target a myriad of featuresets down to the bottom of the install base, as very soon the effort becomes exponential.

Apparently the latter case above is the more demanding in terms of effort. But that is often the price of competitive advantage on the PC scene. Otherwise you risk looking outdated in a highly tech-competitive scene.

Now, let's switch to the CPU side of things. CPUs seldom have shifts in their featuresets - a new ISA extension here, a new register file there, but that is most of the time covered by things like compilers. At the end of the day CPUs vary in sheer performance, and performance alone, compared to GPUs, which vary in both performance _and_ featuresets. Bottom line being, it could be a major endeavor to port something to GPU X which was not developed with GPU X's featureset as a lead platform in mind.

As for how CPU workloads could scale, I'll use the example you provided - but more on that in a sec.

And here is the moment we return to how CPU workload can scale with power. I'm particularly glad you think Most Wanted on the Vita is intact, as it shows how efficient the scaling can be.

Well, in all honesty, I don't disagree with what you have said, but it has nothing whatsoever to do with my point!

My point is as simple as this:

Gameplay code is not scalable, not if you want the same game afterwards.

Games have to run both simulation and rendering, and it seems to me that the WiiU does not have as many resources as the other consoles as far as simulation.

I would argue that they did a good job with NFS on both the Vita and the WiiU. But on both platforms, sacrifices had to be made. (I seem to recall saying "mostly intact" :) On Vita the framerate is halved, the multiplayer player count has been halved, and the amount of traffic in single player is reduced. On the WiiU the online player count has been reduced as well. These are the tough decisions that I alluded to making earlier. Are these reduced, watered down versions of the games worth playing? In this case, perhaps there's still enough of the flavor there for it to be worthwhile. But on Vita, is the difficulty level changed because there are fewer random cars on the road to collide with? On WiiU, are the multiplayer modes just as fun with 2 fewer opponents? These are fundamental changes to the game, different than changing the size of a framebuffer, or using higher resolution models from the PC version to render.

Let's flip it around: If there is no problem and the Espresso is awesome - what reason would Criterion have to reduce the player counts at all?

Let me try again with the original point: Say you are working at DICE on Battlefield 4, and you discover that the only way you are going to hit 30fps on WiiU is to reduce the player counts on the 360/PS3 versions by 25% (in this case, 24 players to 18 players, the same percentage player count reduction done on WiiU NFS).

Do you:

1. Lower the player count on all three platforms, so that you only have a single set of maps with reduced area to playtest and balance. Keep in mind players will remember that BF3 had a higher player count and will wonder what happened.
2. Lower the player count only on WiiU, but then you will have additional work to balance the game and make sure the experience is still fun with the reduced player count. Is it worth it given the WiiU installed base?
3. WiiU LOL - kill the WiiU version and devote the resources freed up to making the other versions better
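As a sanity check on that 25% figure - assuming the 8-vs-6 online player counts the thread implies for NFS (the "2 fewer opponents" mentioned above):

```python
# Applying NFS Most Wanted's WiiU online player-count reduction ratio to a
# hypothetical 24-player Battlefield mode. The 8-vs-6 counts are inferred
# from the thread, not confirmed figures.
nfs_other, nfs_wiiu = 8, 6            # 360/PS3 vs WiiU online players
reduction = 1 - nfs_wiiu / nfs_other  # 0.25, i.e. 25%

bf_console = 24
bf_reduced = round(bf_console * (1 - reduction))  # 18
```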

What is that computational power that Xenon has outside of its SIMD engine? It surely is not IPC and not cache sizes.

Was it not clear enough from my previous post? It's sheer brute force.

Let's say processing power = clock speed * average work done per clock.

You'll notice that I never disagreed that the Xenon is much less efficient (less work done per clock) compared to the Espresso. But at the same time, it's clocked nearly 3x higher. So basically, in the end, it seems that this is the case:

(3.2GHz * lower Xenon work per clock) > (1.25GHz * better Espresso work per clock)

Simple as that.
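To make that concrete - a minimal sketch where the work-per-clock figures are invented purely to illustrate how a clock gap can outweigh a per-clock efficiency advantage:

```python
# Illustrative only: the work-per-clock values below are made up; the real
# numbers depend heavily on workload. The point is the shape of the math.
def effective_throughput(clock_ghz, work_per_clock):
    return clock_ghz * work_per_clock

xenon    = effective_throughput(3.2, 0.5)   # high clock, low IPC (hypothetical)
espresso = effective_throughput(1.25, 1.0)  # low clock, high IPC (hypothetical)

# Even with half the per-clock efficiency, Xenon comes out ahead here:
# 1.6 vs 1.25 in these made-up units.
```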

We are talking here of a DD platform full of 10-20 buck indie titles, with the occasional sub 10 sale, and sub 5 buck indie DLC content. How costly do you expect NFS' DLC certification/QA to be to make that DLC financially unfeasible?

So, as I pointed out, I haven't a clue as to how many copies of NFS were sold on the WiiU.

How many of those are likely to buy the DLC? It's not going to be 100%.

Don't forget to take into account Nintendo's cut of the sales as well as any submission fees. Also don't forget the opportunity cost of the people needed to shepherd it through QA - the producers making the submission to Nintendo, the QA team needed to make sure it's stable. All of those people could be working on something else, but you're going to have them do this for a few weeks instead.

Maybe that equation doesn't add up for EA. Maybe they might even turn a small profit in the end but the opportunity cost of those people working on something that could make more money causes the math to swing back in the other direction.
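The back-of-the-envelope version of that argument, with every number invented for illustration (none of these figures are real EA or Nintendo data):

```python
# All values hypothetical - the point is the shape of the equation.
installed_base   = 50_000   # WiiU copies of NFS sold (made up)
attach_rate      = 0.10     # fraction of owners who buy the DLC (made up)
dlc_price        = 10.00
platform_cut     = 0.30     # platform holder's share of each sale (made up)
cert_and_qa_cost = 30_000   # submission, QA, producer time (made up)
opportunity_cost = 40_000   # value of those people's time elsewhere (made up)

revenue = installed_base * attach_rate * dlc_price * (1 - platform_cut)
profit  = revenue - cert_and_qa_cost

# A small nominal profit can still lose to the opportunity cost:
worth_doing = profit > opportunity_cost
```

With these made-up numbers the DLC clears its direct costs but still isn't worth doing, which is exactly the "math swings back the other direction" scenario described above.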

Do you really think it is a grand conspiracy? I think Alex Ward was on Twitter asking people to buy the WiiU version if they wanted DLC support. Do you think he is lying and that there is no way they would ever bring the DLC over, even if every single WiiU owner picked up a copy of NFS tomorrow?
 

Nilaul

Member
Fail0verflow has something interesting to say about the CPU

Fail0verflow said:
Yes, a 1.6GHz quad-core Cortex-A9 with NEON from ~2010 beats a 1.2GHz tri-core PowerPC 750 with paired singles from ~1997 or 2001 (depending on whether you count the PS or not). The PPC750 is a nice core and has lasted long (and beats crap like the Cell PPU and the 360's cores clock-per-clock on integer workloads), but sorry, contemporary mobile architectures have caught up, and the lack of modern SIMD is significant. Performance varies by workload, but I'm willing to bet that they're similar at integer workloads and the Cortex-A9 definitely has more SIMD oomph thanks to NEON.

The guy who presumably cracked the WiiU.

I guess we can all stop blaming the CPU.
 

TheD

The Detective
Fail0verflow has something interesting to say about the CPU



The guy who presumably cracked the WiiU.

I guess we can all stop blaming the CPU.

Or you could try not highlighting only select parts of a quote (and not ignoring that he said clock-for-clock).

The part after your highlighting is the most interesting: the importance of good SIMD performance.
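A deliberately crude sketch of that point - peak per-clock single-precision SIMD throughput only, ignoring issue width, memory, and everything else. Core counts and clocks are from the quote above; lane widths come from the ISAs (paired singles is 2-wide SP, NEON is 128-bit, i.e. 4-wide SP):

```python
# Peak SP SIMD lanes per clock, times clock, times cores. A crude upper
# bound, not a benchmark - real throughput depends on the workload.
ESPRESSO_LANES = 2   # paired singles: 2 x float32
NEON_LANES     = 4   # NEON: 128-bit = 4 x float32

espresso_peak  = 1.24e9 * ESPRESSO_LANES * 3  # 3 cores @ ~1.24GHz
cortex_a9_peak = 1.6e9 * NEON_LANES * 4       # 4 cores @ 1.6GHz, per the quote

# Even this crude model shows the "more SIMD oomph" claim: the A9 setup's
# peak is several times higher.
```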
 
Fail0verflow has something interesting to say about the CPU



The guy who presumably cracked the WiiU.

I guess we can all stop blaming the CPU.

Why do you think a guy saying the WiiU CPU is worse than a phone CPU from 18 months ago vindicates Nintendo? That's more proof that the CPU is underpowered and is the likely stumbling block for multi-platform development.
 

Woo-Fu

Banned
It's an FPS set in Japan and the graphics look as good as anything on 360 (if not better, there's no jaggies at all) and they say the AI is going to be better than FEAR.

Lol, why do people use FEAR as an AI benchmark? I kept hearing that before playing FEAR and for some reason people keep saying it after playing FEAR.

You can be standing 10 feet from an enemy, with him looking right at you and throw a giant mine on the ground. The enemy will then step on the mine and kill itself trying to get to you.

If that is the high-water mark for AI, we've got a problem. I have to assume this is just a case of people being brainwashed by marketing box quotes. The last time I was impressed by AI was the original Half-Life. That shows how little that aspect of video games has advanced - and perhaps how jaded I've become.
 

prag16

Banned
Well, in all honesty, I don't disagree with what you have said, but it has nothing whatsoever to do with my point!

My point is as simple as this:

Gameplay code is not scalable, not if you want the same game afterwards.

Games have to run both simulation and rendering, and it seems to me that the WiiU does not have as many resources as the other consoles as far as simulation.

I would argue that they did a good job with NFS on both the Vita and the WiiU. But on both platforms, sacrifices had to be made. (I seem to recall saying "mostly intact" :) On Vita the framerate is halved, the multiplayer player count has been halved, and the amount of traffic in single player is reduced. On the WiiU the online player count has been reduced as well. These are the tough decisions that I alluded to making earlier. Are these reduced, watered down versions of the games worth playing? In this case, perhaps there's still enough of the flavor there for it to be worthwhile. But on Vita, is the difficulty level changed because there are fewer random cars on the road to collide with? On WiiU, are the multiplayer modes just as fun with 2 fewer opponents? These are fundamental changes to the game, different than changing the size of a framebuffer, or using higher resolution models from the PC version to render.

Let's flip it around: If there is no problem and the Espresso is awesome - what reason would Criterion have to reduce the player counts at all?

Let me try again with the original point: Say you are working at DICE on Battlefield 4, and you discover that the only way you are going to hit 30fps on WiiU is to reduce the player counts on the 360/PS3 versions by 25% (in this case, 24 player to 18 players, the same percentage player count reduction done on WiiU NFS).

Do you:

1. Lower the player count on all three platforms, so that you only have a single set of maps with reduced area to playtest and balance. Keep in mind players will remember that BF3 had a higher player count and will wonder what happened.
2. Lower the player count only on WiiU, but then you will have additional work to balance the game and make sure the experience is still fun with the reduced player count. Is it worth it given the WiiU installed base?
3. WiiU LOL - kill the WiiU version and devote the resources freed up to making the other versions better



Was it not clear enough from my previous post? It's sheer brute force.

Let's say processing power = clock speed * average work done per clock.

You'll notice that I never disagreed that the Xenon is much less efficient (less work done per clock) compared to the Espresso. But at the same time, it's clocked nearly 3x higher. So basically, in the end, it seems that this is the case:

(3.2 GHz * lower Xenon work per clock) > (1.25 GHz * better Espresso work per clock)

Simple as that.
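Plugging placeholder numbers into that inequality makes the trade-off concrete. The work-per-clock figures below are invented purely for illustration (the post gives none); the point is only that a ~2.56x clock advantage takes an equally large efficiency advantage to cancel out:

```python
# Effective throughput = clock speed * average work done per clock.
# The work-per-clock (IPC-style) values are made-up placeholders.

def effective_throughput(clock_ghz, work_per_clock):
    return clock_ghz * work_per_clock

xenon = effective_throughput(3.2, work_per_clock=0.6)      # high clock, low efficiency
espresso = effective_throughput(1.25, work_per_clock=1.2)  # low clock, high efficiency

print(f"Xenon:    {xenon:.2f}")
print(f"Espresso: {espresso:.2f}")

# Even at double Xenon's per-clock efficiency, Espresso falls short here:
# it would need a full 3.2 / 1.25 = 2.56x efficiency edge just to break even.
assert xenon > espresso
```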



So, as I pointed out, I haven't a clue as to how many copies of NFS were sold on the WiiU.

How many of those are likely to buy the DLC? It's not going to be 100%.

Don't forget to take into account Nintendo's cut of the sales as well as any submission fees, also don't forget the opportunity cost of the people needed to shepherd it through QA - the producers making the submission to Nintendo, the QA team needed to make sure that it is stable, all of those people could be working on something else, but you are going to have them do this for a few weeks instead.

Maybe that equation doesn't add up for EA. Maybe they might even turn a small profit in the end but the opportunity cost of those people working on something that could make more money causes the math to swing back in the other direction.

Do you really think it is a grand conspiracy? I think Alex Ward was on Twitter asking people to buy the WiiU version if they wanted DLC support. Do you think he is lying and that there is no way they would ever bring the DLC over, even if every single WiiU owner picked up a copy of NFS tomorrow?
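The back-of-the-envelope math the post is gesturing at looks something like this. Every figure below is a hypothetical placeholder (sales, attach rate, platform cut, and costs are all invented for illustration, not real NFS or EA numbers):

```python
# Hypothetical DLC break-even sketch for the scenario described above.
# All figures are invented placeholders, not real NFS or EA numbers.

def dlc_net(installed_base, attach_rate, price, platform_cut, fixed_costs):
    """Net return after the platform holder's cut and one-off porting/QA costs."""
    units = installed_base * attach_rate
    revenue = units * price * (1 - platform_cut)
    return revenue - fixed_costs

# Toy scenario: 50k copies sold, 10% DLC attach rate at $10,
# a 30% platform cut, and $40k of porting/QA/submission costs.
net = dlc_net(50_000, 0.10, 10.0, 0.30, 40_000)
print(f"Net: ${net:,.0f}")  # a small loss on these assumptions
```

Double the installed base in this toy model and the same DLC turns a profit, which is roughly the argument Alex Ward was making on Twitter.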

Your CPU analysis is terrible. I don't think you read anything blu said. You grossly oversimplify things and then carry that over to your NFS/Battlefield diatribe which is all conjecture. Criterion never said why they reduced the online player count.

SIMD or floating point stuff, Cell/Xenon beat espresso (and quite possibly the 8-core jaguars as well). Integer or general purpose code, espresso should pull ahead.

Saying that an unoptimized frostbite running on an early Wii U devkit is a good indicator of capability is questionable regardless. Hell, PS4 would probably choke on FB3 left totally unoptimized. It's an engine that can at times bring a much more powerful i7 to its knees after all.
 
Your CPU analysis is terrible. I don't think you read anything blu said. You grossly oversimplify things and then carry that over to your NFS/Battlefield diatribe which is all conjecture. Criterion never said why they reduced the online player count.

SIMD or floating point stuff, Cell/Xenon beat espresso (and quite possibly the 8-core jaguars as well). Integer or general purpose code, espresso should pull ahead.

Saying that an unoptimized frostbite running on an early Wii U devkit is a good indicator of capability is questionable regardless. Hell, PS4 would probably choke on FB3 left totally unoptimized. It's an engine that can at times bring a much more powerful i7 to its knees after all.

I would love to hear a single alternative explanation for this. Have they reduced it out of spite? Did they think less players made it more fun? Did they do it because the install base is smaller?

Delusional fanboyism is an incredible defect.
 

A More Normal Bird

Unconfirmed Member
Will it be open-world? Because Jane loves open-world games, she won't leave the house, so giving her a sandbox game based in Japan would give us a 'vacation' of sorts.
Unfortunately not, but there's always the much vaunted Tokyo Street experience available on the EShop. IGN said its graphics were even more impressive than any of the other demos!

Lol, why do people use FEAR as an AI benchmark? I kept hearing that before playing FEAR and for some reason people keep saying it after playing FEAR.

You can be standing 10 feet from an enemy, with him looking right at you and throw a giant mine on the ground. The enemy will then step on the mine and kill itself trying to get to you.

If that is the high-water mark for AI we've got a problem. I have to assume this is just a case of people being brainwashed by marketing box quotes. The last time I was impressed by AI was the original half-life. That shows how little that aspect of videogaming has advanced---and perhaps how jaded I've become.

AI in general may not have improved too much, but I think the promise of PC-level AI (as a baseline; who knows how much better it could be) in an FPS with melee combat, coupled with Nintendo's revolutionary new controller, makes this new and unreleased FPS set in Japan by Ubisoft, available exclusively on Nintendo's next-gen console, one to look out for
sound familiar?
 

Schnozberry

Member
I would love to hear a single alternative explanation for this. Have they reduced it out of spite? Did they think less players made it more fun? Did they do it because the install base is smaller?

Delusional fanboyism is an incredible defect.

Well, the game has no problem rendering more than 6 cars in single player mode, so I was assuming it was related to how the net code performed on Wii U, or online infrastructure issues.
 
Well, the game has no problem rendering more than 6 cars in single player mode, so I was assuming it was related to how the net code performed on Wii U, or online infrastructure issues.

How the netcode worked on the WiiU? Seriously? Really?

I'm almost speechless. And sad for you..

The number of AI characters and the number of multiplayer characters on screen cause remarkably different CPU and GPU workloads; anyone who has ever gamed on PCs knows this. Just because it performs at X fps with Y AI characters doesn't mean it will perform at X fps with Y multiplayer characters.
 

StevieP

Banned
How the netcode worked on the WiiU? Seriously? Really?

I'm almost speechless. And sad for you..

The number of AI characters and the number of multiplayer characters on screen cause remarkably different CPU and GPU workloads; anyone who has ever gamed on PCs knows this. Just because it performs at X fps with Y AI characters doesn't mean it will perform at X fps with Y multiplayer characters.

Before your crucifixion continues, note that the now-gone FIFA Wii U team had some issues with Nintendo's online tools.
 
Activision had to turn down Call of Duty's lighting effects and make other changes to get it running on Wii U.

Frostbite...they simply said the results "weren't promising". That doesn't mean they couldn't get it to run ...but if you've got to re-write the engine to work on new architecture, deal with lower memory bandwidth, etc, and then expect low sales... What's the point?
 

freddy

Banned
I think it's pretty obvious at this point that, while they may have had some technical problems, the money and incentive from the higher-ups at EA just wasn't there. EA is making huge cutbacks. If the WiiU had been flying out of the gate like the Wii, the engine would be running on it by now. I think EA has known for some time that the purse strings would be tightened, and something about the way Nintendo was going to approach the next generation left them lukewarm on console support.
 

squidyj

Member
Ugh, can we please stop this? Every time a Wii U title performs badly it's the hardware blamed. Every time it performs better "Well, the game clearly wasn't CPU bound" or some other reason is given to excuse the performance.

No, if I remember correctly, "lazy devs" were always blamed.
 
Before your crucifixion continues, note that the now-gone FIFA Wii U team had some issues with Nintendo's online tools.

Wasn't that just the online FIFA trading card game?

EDIT: Found the quote

the Wii U version doesn't include FIFA Ultimate Team, the add-on virtual trading card game that is immensely popular among the series' gargantuan fanbase. Prior said this is the result of the Wii U's online offering being in its "infancy".

"We don't have Ultimate Team, purely and simply because Nintendo's online is in its infancy," he said. "It's building. FUT took five years to appear on 360 and PS3. They're very complex features. It's potentially something we could do further down the line. But in terms of initially getting the foundation set, that wouldn't have been technically feasible, because it is such a complex mode."
 

AzaK

Member
Why do you think a guy saying the WiiU CPU is worse than a phone CPU from 18 months ago vindicates Nintendo? That's more proof that the CPU is underpowered and is the likely stumbling block for multi-platform development.
It's worse at SIMD. Otherwise it's actually quite good.

Activision had to turn down Call of Duty's lighting effects and make other changes to get it running on Wii U.

They also got it running two separate views, one at TV res, one at GamePad res.
 

Schnozberry

Member
How the netcode worked on the WiiU? Seriously? Really?

I'm almost speechless. And sad for you..

The number of AI characters and the number of multiplayer characters on screen cause remarkably different CPU and GPU workloads; anyone who has ever gamed on PCs knows this. Just because it performs at X fps with Y AI characters doesn't mean it will perform at X fps with Y multiplayer characters.

That's what I was trying to point out. Perhaps the network infrastructure or poorly optimized net code was causing lag issues, and they opted for less players online to solve the problem. EA hasn't put much effort into Wii U, and the game is run through Origin, so I'm assuming they are hosting the online multiplayer.

Please save your insults for someone who doesn't just laugh at them. You're a terrible troll.
 
Before your crucifixion continues, note that the now-gone FIFA Wii U team had some issues with Nintendo's online tools.

As someone noted before, that wasn't a problem with FIFA's multiplayer, and it wasn't a netcode or bandwidth issue that could have been solved by halving player numbers.

It was basically WiiU's online services being 'pretty bare bone' at the time.

Having 12 cars in the same match does not require very sophisticated network services. It basically requires more bandwidth and hardware power.
 
Yeah, and Max Payne couldn't have been done on GameCube. Some developers just want every platform to be the same and have no interest in optimizing for unique architectures. "We ran some tests on a different engine" is hardly even an attempt to seriously answer this question about Frostbite 3.

The really sad thing is that Nintendo hasn't shown up with anything to prove them wrong about the system's capabilities. Either it really is a lump of shit, or even the system's creators aren't interested in pushing the metal.
 
The really sad thing is that Nintendo hasn't shown up with anything to prove them wrong about the system's capabilities. Either it really is a lump of shit, or even the system's creators aren't interested in pushing the metal.

Nintendo's first party titles have never been about 'pushing the metal'. If you're looking to Nintendo to "prove" 3rd party devs are lazy, you'll be looking a long time. That's just not what Nintendo is about when it comes to their own games.
 