gofreak said:
It's of course not the only viable way, but it's the way developers have come to reasonably expect. The performance per watt with these other chips isn't, I presume, so dire that they're crying out for a more fixed but power efficient approach. Devs have come to reasonably expect they can have both the kind of programmability they're used to elsewhere while not impinging on workable power draw.
There's a portability issue. Yes, 3DS is its own dedicated device, developers can target it specifically in a way they wouldn't any one specific smartphone for example, but it's not an island. It, I'd say, would be a fairly big deal if it did happen to put off middleware support from the likes of Epic for example. I'd hope Nintendo might have consulted externally before making these choices, in which case this might not be a worry, but we'll have to wait and see.
Finally, I'm not sure what 'most of the ES 2.0 effects' means. I'm sure 3DS covers some common cases, but flick through the volumes that have been written in the likes of the GPU Gems series and you'll see that there's an awful lot of variety. You'll be able to map some stuff to what's provided by the 3DS GPU, but it's hard to talk about levels of coverage. This is before we even get to talking about doing other non-rendering work on these kinds of GPUs (which will happen soon if it hasn't already). More than that, though, I think there is a downside to this approach from the point of view of this being a dedicated device to last several years - there's a smaller scope for evolution in how the GPU is used. There's a smaller scope to apply new techniques and insights and experience that'll undoubtedly crop up in the next 5 years. So yes, devs can hunker down on 3DS and do things its way, but on the other hand there's not a huge amount of depth for them to explore in terms of focusing on it. The goalposts are mostly set from the start.
So while I can agree to a certain extent with the "does it matter if the 'expressive power' is good now?" sentiment, and I can agree with the power consumption argument, on the other hand there are downsides too. It is a double-edged sword, there are tradeoffs (or potential tradeoffs) and I could understand if people thought they were notable ones.
There are always tradeoffs, there'll never be an ideal solution, and only time will tell whether Nintendo made the right choice. I can understand the desire for more general programmability, but I also understand that power efficiency comes before all else.
As for the "covers most of the use cases" comment, I admit I was far too vague and you're right to call me out on it. What I really mean is that it doesn't matter how many fantastic tricks and effects ES 2.0 compliant hardware allows developers to create in isolation and present in white papers; in the end, 90%+ of them are going to be a terrible fit for an actual shipping game engine, especially one which targets mobile hardware. Even on the PS3 and 360, where developers have much more shading power to play with than they do on any mobile platform, we still tend to see the same set of base shader effects used no matter the engine, with just a few extras here and there.
On mobile hardware the situation is even bleaker: we've yet to see any example of an engine targeting ES 2.0 hardware that pulls off anything that wasn't possible on decade-old fixed function hardware. Epic Citadel is the shining example for most, but what shader effects is that demo really pulling off besides some nice dot3 bump mapping, specular highlights and reflection/refraction effects? Heck, even GeForce 2 level hardware is capable of all of that, so what use is that general programmability if the actual hardware isn't fast enough to take proper advantage of it in actual games? Epic Citadel isn't even a game, ffs, it's just a tech demo without character models!
I'd compare the scenario to something like the DC's dot3 bump mapping support: sure, the hardware was technically capable of it, but when the only time it was used was for a single crappy-looking coin in Shenmue, is it really a feature worth screaming from the rooftops about?
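For anyone wondering why dot3 bump mapping is such a low bar: per texel it's literally just a dot product between a normal decoded from an RGB texture and a light direction, which is exactly the kind of single fixed operation a GeForce 2-class register combiner can do without any programmable shading. A rough sketch of the math (all function names here are illustrative, not any real API):

```python
def decode_normal(rgb):
    """Map an 8-bit RGB texel (0..255) back to a [-1, 1] normal vector."""
    return tuple(c / 127.5 - 1.0 for c in rgb)

def dot3_diffuse(texel_rgb, light_dir):
    """Clamped N.L diffuse term for one texel; light_dir assumed normalized
    and already transformed into the same (tangent) space as the normal map."""
    n = decode_normal(texel_rgb)
    d = sum(nc * lc for nc, lc in zip(n, light_dir))
    return max(0.0, d)  # clamp back-facing contribution to zero

# A texel of roughly (128, 128, 255) encodes the "flat" normal (0, 0, 1),
# so a head-on light gives full diffuse intensity:
print(round(dot3_diffuse((128, 128, 255), (0.0, 0.0, 1.0)), 2))  # prints 1.0
```

That's the whole trick; everything past that (per-pixel specular exponents, multiple lights, procedural effects) is where programmable hardware actually starts to earn its keep.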
I fully understand the merits of having "developer friendly" hardware, and it's one of the major reasons I'm such a fan of the Xbox 360's design. Microsoft delivered a piece of hardware that was not only fast and reasonably cheap to produce (at least in comparison to the PS3) but also simple for developers with a PC background to take advantage of from day 1, and the results speak for themselves. Having said that, I really don't think the 3DS is such an unfriendly system for developers. It's a huge leap forward compared to the Wii, and you must remember that in the portable space Nintendo are coming from hardware with an N64-level featureset; even a design like Flipper or a GeForce 2 would be a huge step forward considering that, but PICA200 clearly goes way above and beyond.
Now, do I feel like a similar design would be suitable for a home console? No, absolutely not. I think I've made my feelings quite clear on that subject, and I expect Nintendo to deliver at least an OpenGL 3.0 compliant device with their home hardware refresh in 2011. I just tend to feel that maybe mobile hardware made the switch to general programmability a little too quickly. Is DX10.1 compliance really all that necessary for mobile platforms? The PC space didn't make the switch to unified shaders and general programmability until we had 8800GTX levels of raw performance, and mobiles are clearly a decade+ away from that.
I understand why the move was made, though: most mobile hardware targets open hardware platforms where GPUs can and will chop and change from year to year, so a general purpose high level shader language really is the only way to solve that problem. But when you're not tied to that restriction, why not explore the alternatives? Call me naive, but I honestly believe that the right compromise between general programmability and fast fixed hardware lies somewhere short of full unified shaders when we're talking about parts with such low performance and tight power constraints. We'll see who is right as this plays out, but I'm at least pleased someone is taking a different approach to the problem, as it makes things more interesting and opens up this sort of discussion. I'm just not a fan of the belief that the march towards general programmability with GPUs is the only solution, especially in markets where performance per watt is the only metric worth a damn.
P.S.
Thanks Blu for the pearls of wisdom, really insightful!
:D
I didn't think this post would end up quite so long! :lol Hopefully someone appreciates my babbling and it's not all in vain, though I kinda doubt that! :lol
P.P.S. I'm not doubting for one minute that there's a portability issue, mind you. I just don't think it's as big a deal as some make it out to be after seeing the utterly fantastic work Capcom have done in so little time. MT Framework on 3DS really is an incredible achievement and a huge step forward for mobile graphics in general, imo.