GDJustin said:
This makes no sense. In-game ads are finally coming into their own, so gamers on a message board are shouting "omg lower development costs!" when in the topics above and below Gears of War hype is reaching new heights, an Oblivion topic surpasses 10,000 replies, and Assassin's Creed is on everyone's mind/tongue.
How do you propose they go ABOUT lowering new dev costs, eh? No voice acting in the next Final Fantasy? Fewer texture artists in the next Elder Scrolls? Come on, man.
I love boutique games as much as the next guy. I mean shit, I RUN a handheld gaming website - one of the few mediums where game companies can still make a hit with a very small dev team. I love "small" games. But at least I recognize that the industry at large, the majority of consumers, aren't like that. They want big blockbusters, summer-movie style.
Indie flicks like Lost in Translation can be hugely profitable, but they aren't as bankable as summer blockbusters like Spider-Man, PotC, etc. The games industry is the same way.
All I'm saying is that whenever new hardware is released, the most heavily marketed feature is the graphics. Developers and publishers have to try and keep up with the Joneses, so to speak, by continuously making higher and higher quality graphics. That's the reason new consoles get developed (excluding the Wii).
I think it's a widely known fact that graphics and audio are the most expensive parts of game development. Why is that? Because developers and publishers are constantly pressured by the console manufacturers to push graphics further.
What has been emphasized by MS and Sony? Sure, we've heard about Sony Connect, Velocity Girl, and stuff like that, but the MAIN thing they keep touting is HIGH DEFINITION GRAPHICS!!!
It puts pressure on developers and publishers alike to produce HIGH DEFINITION GRAPHICS that stand up to what the console manufacturers are touting as the reason to buy their systems. Even when a dev may not want to spend that much money on graphics, some other studio releases a video like the Killzone CGI or the FFVII tech demo, and they feel they have no choice but to try and keep up - and to do that, they have to spend several times the money they currently do.
Look at the last year or so of the PS2's games. While the GC and Xbox have been dead for the last year, the PS2 has gotten really incredible and unique games like SotC and Okami. I think that has to do with developers finally getting really comfortable with the hardware, so they're able to work on graphics, audio, and gameplay at a relatively equal pace. With new hardware, by contrast, the newer, higher-quality graphics seem to stretch out the development cycle even after the rest of the game is completed. (I recall hearing about this from the Kameo, PDZ, and PGR3 developers in the months leading up to the Xbox 360's launch - they were spending the last few months of development working solely on the graphics of those games. Everything else was done.) It seemed like the PS2, Xbox, and GC were coming into their own just when the plugs were pulled on them and the transition toward next-gen began. Conker, Ninja Gaiden Black, Okami, FFXII, and RE4 are all examples of this.
I know graphics are an important part of games; they're probably the second-most important aspect of a game to most people, and I love seeing stuff that wows me. But I think it would benefit the entire gaming industry if developers and publishers put a little pressure on console manufacturers to slow down the hardware progression. That would give devs the time to fully master a piece of hardware graphically, and also the time to develop radical new games.
I'm not sure if what I'm trying to say is coming out right, but I think consoles hit their sweet spot around the 4 or 5 year mark, graphically. When devs reach that sweet spot - where they feel they've gotten as good as they can get graphically on current hardware - they can start pooling more of their resources and development time into new gameplay ideas. Their development costs may come down too, since they'll be used to the APIs and architectures of each console and thus be able to complete games more quickly. This saves publishers money as well, and with system hardware reaching sweet-spot prices of $99, $130, and/or $150, they'll have plenty of potential buyers for their games.
To sum it all up, extending the lifespans of systems is a good thing because:
#1: It slows down the rate of hardware progression in the video game industry, thus almost certainly slowing the growth of development costs, most of which go toward the graphics and audio of games.
#2: It gives developers time to keep making games even after reaching the peak of a system's graphical capabilities.
#3: With developers all reaching graphical plateaus on systems, they can focus on one-upping each other through their games' gameplay. In other words, they can shift their efforts toward the development of new and creative gameplay elements.
#4: Developers can pump out higher-quality games at faster rates than they could have earlier in the generation.
#5: Faster development cycles result in lowered development costs for publishers.
#6: Lower hardware prices result in mass-market penetration and large userbases, which means more potential customers for a game and thus more potential revenue for publishers and developers.
#7: Lower hardware production costs let console manufacturers profit on each subsequent hardware sale. The large userbases of their systems should, theoretically, result in more total software sales over a given period (say, one year) than earlier in the hardware's life cycle, providing console manufacturers higher revenues and profits from royalties.