
PS3 games list & SPE usages

Pistolero

Member
Oh, Ghosthunter is the elephant in the room nobody seems to notice. That was an amazing engine pusher, with superb lighting and textures...
Thanks for correcting me. I've also read that Eight Days was canned... Who knows, though...
 

jax (old)

Banned
I don't read/post as much as I'd like, but this is one of the best/most informative reads I've gone through on here in a while. Kudos to MikeB
 

Madman

Member
Pistolero said:
Oh, Ghosthunter is the elephant in the room nobody seems to notice. That was an amazing engine pusher, with superb lighting and textures...
Thanks for correcting me. I've also read that Eight Days was canned... Who knows, though...
Last we heard it was still alive. Who knows what happened since.
 
They should be able to get close to the concept video

[Eight Days concept screenshots]


People are always bugging Phil about Afrika, they should be asking him about this.
 

garrickk

Member
A while ago, when we were first learning of the Cell architecture, I thought it was postulated that eventually a library would be developed to allow one of the SPEs to function as a branch predictor for the PPU - allowing the PPU to more efficiently crunch "general purpose" code.

I'm sure a lot of this is predicated on how efficiently the system memory <-> Cell cache operates, how well Cell cache <-> SPE cache flows, the depths of the cores, etc. The raw processing of the SPE utilized for this may be squandered, but hey, there are plenty of them for most applications, right? I mean, if the raw throughput of the PPU is increased, the loss of a single SPE isn't missed. What about people using Linux, etc.?

I'm not sure where I read this theory - EETimes or perhaps just an ArsTechnica forum post from a member who sounded technically sound? For all I know, it's not at all feasible, or actual software engineers got their hands on the hardware and realized it was either too difficult to implement or offers little benefit. Just curious if anyone has heard anything.
 

jonabbey

Member
garrickk said:
A while ago, when we were first learning of the Cell architecture, I thought it was postulated that eventually a library would be developed to allow one of the SPEs to function as a branch predictor for the PPU - allowing the PPU to more efficiently crunch "general purpose" code.

I'm sure a lot of this is predicated on how efficiently the system memory <-> Cell cache operates, how well Cell cache <-> SPE cache flows, the depths of the cores, etc. The raw processing of the SPE utilized for this may be squandered, but hey, there are plenty of them for most applications, right? I mean, if the raw throughput of the PPU is increased, the loss of a single SPE isn't missed. What about people using Linux, etc.?

I'm not sure where I read this theory - EETimes or perhaps just an ArsTechnica forum post from a member who sounded technically sound? For all I know, it's not at all feasible, or actual software engineers got their hands on the hardware and realized it was either too difficult to implement or offers little benefit. Just curious if anyone has heard anything.

Given the granularity of the DMA transfers to the SPEs, I can't imagine it would be efficient to do. Effectively, you'd wind up having the SPE reprogram the PPE dynamically, but in order to do branch prediction you'd need to know what the PPE had done on previous trips through the branch, which would involve significant overhead in PPE->SPE communication.

Particularly when the PPE already has a perfectly serviceable branch predictor.
 

loosus

Banned
Does the processor in the Wii share any similarities with those of the 360 and PS3? Or are they just entirely different beasts despite being made by IBM?
 

noonche

Member
garrickk said:
A while ago, when we were first learning of the Cell architecture, I thought it was postulated that eventually a library would be developed to allow one of the SPEs to function as a branch predictor for the PPU - allowing the PPU to more efficiently crunch "general purpose" code.

I'm sure a lot of this is predicated on how efficiently the system memory <-> Cell cache operates, how well Cell cache <-> SPE cache flows, the depths of the cores, etc. The raw processing of the SPE utilized for this may be squandered, but hey, there are plenty of them for most applications, right? I mean, if the raw throughput of the PPU is increased, the loss of a single SPE isn't missed. What about people using Linux, etc.?

I'm not sure where I read this theory - EETimes or perhaps just an ArsTechnica forum post from a member who sounded technically sound? For all I know, it's not at all feasible, or actual software engineers got their hands on the hardware and realized it was either too difficult to implement or offers little benefit. Just curious if anyone has heard anything.

The SPEs don't have cache. It's all programmer controlled, which is both an advantage and a disadvantage. I can't find anything in IBM's documentation on how deeply the SPEs are pipelined. They state that the PPU is around 20 stages deep. I kind of doubt the SPEs are that deep.

A lot of the design decisions made in Cell seem to be fairly similar to Itanium. Cell doesn't seem to have speculative loads, but it is massively parallel, doesn't have out-of-order execution and seems to rely heavily on programmers optimizing their code. Documentation on both architectures seems to indicate that a lot of this programmer optimization is supposed to be compiler aided, and at least in Cell's case, that magic compiler doesn't seem to exist.
 

PistolGrip

sex vacation in Guam
SolidSnakex said:
No, that's Sony London; they're currently working on Singstar, and at least had Eight Days in development, if it's still around. Cambridge is Ghosthunter and Primal

Ghosthunter
[Ghosthunter screenshots]
Wow, Ghosthunter looks amazing for a PS2 game.
 

MonkeyLicker

Art does not make 60FPS @ 1080i with real world physics on the PSf*ckin2.
MikeB said:
For this generation there's a much greater performance and specs gap compared to last gen.
If you're talking about the Wii, yeah. If you're talking about the 360 and PS3, you're wrong. These systems are the closest any two systems have ever been in the same generation.
 

garrickk

Member
alske said:
The SPEs don't have cache. It's all programmer controlled, which is both an advantage and a disadvantage. I can't find anything in IBM's documentation on how deeply the SPEs are pipelined. They state that the PPU is around 20 stages deep. I kind of doubt the SPEs are that deep.

A lot of the design decisions made in Cell seem to be fairly similar to Itanium. Cell doesn't seem to have speculative loads, but it is massively parallel, doesn't have out-of-order execution and seems to rely heavily on programmers optimizing their code. Documentation on both architectures seems to indicate that a lot of this programmer optimization is supposed to be compiler aided, and at least in Cell's case, that magic compiler doesn't seem to exist.
I thought the PPE had 512KB of cache and EACH SPE had 256KB of cache - all running at the full clock speed.

BTW, jonabbey's post above I think nailed it. Branch prediction in the SPE is sort of impossible without the SPE knowing precisely (or at least efficiently) which branches the PPE is taking. I assumed that the PPE's branch prediction was so meager and perhaps "obvious" that the SPE could guess/know what the PPE was doing and catch the "wrong" branch every time. Even if that were the case, there is a lot of overhead, multiple branches, and other problems. I hadn't really thought it through.
 

noonche

Member
garrickk said:
I thought the PPE had 512KB of cache and EACH SPE had 256KB of cache - all running at the full clock speed.

BTW, jonabbey's post above I think nailed it. Branch prediction in the SPE is sort of impossible without the SPE knowing precisely (or at least efficiently) which branches the PPE is taking. I assumed that the PPE's branch prediction was so meager and perhaps "obvious" that the SPE could guess/know what the PPE was doing and catch the "wrong" branch every time. Even if that were the case, there is a lot of overhead, multiple branches, and other problems. I hadn't really thought it through.

Each SPE has 256 KB of memory that is as fast as cache. It is not cache, though; it's completely under programmer control. Check the diagrams on page 24 of the Cell Broadband Engine Architecture. It appears that there is a shared cache for all the SPEs, but that is likely for when you try to pull in data over the EIB.
 

ROFL

Hail Britannia
Pistolero said:
Oh, completely forgot about the Cambridge guys... Aren't they the ones responsible for The Getaway? The fact that they're helping propagate technical excellence through the veins of Sony's studios won't tie their hands, I hope. The first Getaway was quite impressive, if short of some solid gameplay mechanics.
GoW III will come to stab every other self-imposed god once everything is said and done...

Getaway was done by my old mates Team Soho (or 'Sony London' - ugh). They released some very impressive early target renders yonks ago for a PS3 Getaway but I've not heard or seen anything since...

http://en.wikipedia.org/wiki/The_Getaway_(video_game)

<The Getaway 3
A third Getaway title is in development (with a working title of The Getaway 3, or The Getaway: Working Title) by Team SOHO for the PlayStation 3. A technical demo featuring Piccadilly Circus was demonstrated in May 2005, but this was not directly from the game. It has been confirmed that a third game would again be set in London; however, rumours have been circulating that the game could also feature a city in Europe, more than likely Amsterdam given the most recent trailer.>
 

gofreak

GAF's Bob Woodward
ROFL said:
Getaway was done by my old mates Team Soho (or 'Sony London' - ugh). They released some very impressive early target renders yonks ago for a PS3 Getaway but I've not heard or seen anything since...

It seems it is still in development, at least... There was a conference pretty recently where one of the speakers from Sony listed in her bio that she was working on the story for The Getaway on PS3.

I'm guessing we'll hear more about it next year. I wonder how big/ambitious the project is, it seems like it's getting a fairly long gestation period.
 

jonabbey

Member
alske said:
The SPEs don't have cache. It's all programmer controlled, which is both an advantage and a disadvantage. I can't find anything in IBM's documentation on how deeply the SPEs are pipelined. They state that the PPU is around 20 stages deep. I kind of doubt the SPEs are that deep.

I believe the SPUs have variable pipeline length, depending on the operation. The longest operations take 7 cycles. The SPUs are also dual-issue, with certain operations on one pipe and the rest on the other.

A lot of the design decisions made in Cell seem to be fairly similar to Itanium. Cell doesn't seem to have speculative loads, but it is massively parallel, doesn't have out-of-order execution and seems to rely heavily on programmers optimizing their code. Documentation on both architectures seems to indicate that a lot of this programmer optimization is supposed to be compiler aided, and at least in Cell's case, that magic compiler doesn't seem to exist.

The SPUs have branch hinting, at least, and they will pre-fetch the hinted branch.

And, yes, the compilers aren't anything close to magic, but one nice thing about the SPU architecture is that you can focus your optimization in a very well understood environment, without variable memory access latency, for specific discrete tasks running uninterrupted.

If you had to do that kind of optimization image-wide for a multithreaded environment running on the PPU, say, you'd have a much harder time achieving near-optimal performance.
 

noonche

Member
jonabbey said:
I believe the SPUs have variable pipeline length, depending on the operation. The longest operations take 7 cycles. The SPUs are also dual-issue, with certain operations on one pipe and the rest on the other.



The SPUs have branch hinting, at least, and they will pre-fetch the hinted branch.

And, yes, the compilers aren't anything close to magic, but one nice thing about the SPU architecture is that you can focus your optimization in a very well understood environment, without variable memory access latency, for specific discrete tasks running uninterrupted.

If you had to do that kind of optimization image-wide for a multithreaded environment running on the PPU, say, you'd have a much harder time achieving near-optimal performance.

I totally agree. I was merely pointing out that both Intel and IBM promised compilers for their exotic chipsets, and that they never really materialized. Granted, programming an SPE is probably much simpler than hand-generating IA-64 assembler (yuck).

I think one of the larger headaches when working with SPUs would be the lack of synchronization primitives. This is part (most?) of the reason that the PPU exists. If they had some you could essentially treat the SPUs as a slightly wonky multicore processor. However, the lack of syncing probably significantly cut down on the silicon required to make Cell work.
 

MikeB

Banned
Added the following comments to the original post regarding Ratchet and Clank Future:

"We are continuing to build our Insomniac Engine and have made many improvements to it since Resistance: Fall of Man. The one huge focus for us has been moving more of our processes over to the SPUs on the CELL processor. This has allowed us to get our physics and effects systems running roughly four times faster than it did in Resistance at nearly double the framerate, which is something you can see in weapons like the Tornado Launcher."

Source: The New Zealand Herald

http://www.nzherald.co.nz/section/story.cfm?c_id=5&objectid=10462728
 

DSWii60

Member
ROFL said:
Getaway was done by my old mates Team Soho (or 'Sony London' - ugh). They released some very impressive early target renders yonks ago for a PS3 Getaway but I've not heard or seen anything since...

http://en.wikipedia.org/wiki/The_Getaway_(video_game)

<The Getaway 3
A third Getaway title is in development (with a working title of The Getaway 3, or The Getaway: Working Title) by Team SOHO for the PlayStation 3. A technical demo featuring Piccadilly Circus was demonstrated in May 2005, but this was not directly from the game. It has been confirmed that a third game would again be set in London; however, rumours have been circulating that the game could also feature a city in Europe, more than likely Amsterdam given the most recent trailer.>

If you know anyone who is working on The Getaway, please get them to include more of South and West London. It really frustrated me that the nearest I could get to the areas I know were Buckingham Palace (to the south-west) and Hyde Park (to the west).
 

Gadfly

While flying into a tree he exclaimed "Egad!"
alske said:
I totally agree. I was merely pointing out that both Intel and IBM promised compilers for their exotic chipsets, and that they never really materialized. Granted, programming an SPE is probably much simpler than hand-generating IA-64 assembler (yuck).
..
I have looked at assembly output of C compiler generated code for Itanium and it is not too shabby.
 

MikeB

Banned
MikeB wrote:

Any info on Ninja Gaiden Sigma, DiRT, Tekken 6, Metal Gear Solid 4, Haze, Turok, Hot Shots Golf 5, etc regarding SPE usages?

I have now added Ninja Gaiden Sigma to the list:

17) Ninja Gaiden Sigma

"There are games that have thousands of enemies at once, and some of them don’t move or do much. For us it’s about making sure they have goals they are fulfilling and that they can work together with existing enemies. That’s really our philosophy. As far as A.I. goes, people who have played our game before will see that we’ve made some subtle improvements. A lot of it has to do with using all the Cell’s SPU processors."

Source: Next-Gen Gamer

http://nextgengamer.wordpress.com/2007/03/26/new-ninja-gaiden-sigma-interview/
 

MikeB

Banned
Also added the following quote to the original post regarding Ninja Gaiden Sigma:

"One important thing about the PS3 is the seven SPUs in addition to the main processor, and using those is what allows you to get the best graphic quality. So we have an entire section of programmers which is assigned to figure out how to get those SPUs working on graphics to the fullest. Now we're actually at the point where day-to-day you can see the graphic quality improve before your eyes, so to speak. So if you look at the screenshots we have for you today -- if you look at those compared to what we put out at TGS, there's a big difference in terms of the textures and the atmosphere. You can see the steps we've made since then."

Source: 1up

http://www.1up.com/do/previewPage?pager.offset=0&cId=3156377
 

MikeB

Banned
I added a short comment regarding Heavenly Sword development.

"Personally I really love the SPUs as they have exceeded our performance expectations and we've got a lot of them to play with."

Source: Eurogamer
 

Hammer24

Banned
Don't you just hate it when your absolutely perfect speculative fanboy thread gets totally ruined by actual devs and programmers?
 

chubigans

y'all should be ashamed
Hammer24 said:
Don't you just hate it when your absolutely perfect speculative fanboy thread gets totally ruined by actual devs and programmers?

Wow, you're an idiot.

And keep up the awesome work Mike. :)
 

MikeB

Banned
@ Hammer24

I take informational posts like alske's, for instance, over pointless fanboy/troll-baiting posts any day. If that makes me an idiot, so be it.

I find the comments provided by expert PS3 developers most informative, that's the main reason why I started this thread. Why not try to add something useful yourself, instead of trolling?
 

Hammer24

Banned
MikeB said:
@ Hammer24

I find the comments provided by expert PS3 developers most informative, that's the main reason why I started this thread.

Then why do you post PR speak? The forum posts here are way more believable than those.

Why not try to add something useful yourself, instead of trolling?

Simple: I am neither a programmer nor a dev, so I couldn't add anything substantial to a technical topic like this. But I enjoy reading them.
Don't you notice that you scare the experts away with your PR quotes?
 

dogmaan

Girl got arse pubes.
Hammer24 said:
Then why do you post PR speak? The forum posts here are way more believable than those.



Simple: I am neither a programmer nor a dev, so I couldn't add anything substantial to a technical topic like this. But I enjoy reading them.
Don't you notice that you scare the experts away with your PR quotes?


Mike B uses the quotes as evidence of SPU usage, not to scare devs away, or for PR purposes.
 

Orlics

Member
dogmaan said:
Mike B uses the PR quotes as evidence of SPU usage, not to scare devs away, or for PR purposes.

Don't pretty much all games use the PS3's SPUs though? I mean, PGR3 used all 3 cores of the 360, and that was a launch game.
 

MikeB

Banned
The quotes most often originate from the actual developers and not PR departments. You can find many similar comments from PS3 developers at the Beyond3D forum. IMO PS3 exclusive developers often have the best insight with regard to the PS3 technology as they often (re)design their gaming engines from scratch.
 

MikeB

Banned
@ Orlics

No, they don't, because game engines need to be redesigned to take advantage of the SPEs. More development time equals more cost, and cross-platform development is harder because neither the Xbox 360 nor the PC provides anything truly equivalent to an SPE. The Cell isn't just a multi-core CPU; the SPEs act more like independent processors than cores in a conventional multi-core design.

I will post a new message in which I compare the Xbox 360 and PS3 in greater depth. I am sure it will wind up some fanboys, though.
 

Busty

Banned
Hammer24 said:
Don't you just hate it when your absolutely perfect speculative fanboy thread gets totally ruined by actual devs and programmers?


*goes to type furious reply*


Hammer24 said:
.....I couldn't add anything substantial to a technical topic like this.

o_O


Hammer24
Owning himself,
one post at a time.

(Today, 02:11 PM)
Reply | Quote
 

Orlics

Member
MikeB said:
@ Orlics

No, they don't, because game engines need to be redesigned to take advantage of the SPEs. More development time equals more cost, and cross-platform development is harder because neither the Xbox 360 nor the PC provides anything truly equivalent to an SPE. The Cell isn't just a multi-core CPU; the SPEs act more like independent processors than cores in a conventional multi-core design.

I will post a new message in which I compare the Xbox 360 and PS3 in greater depth. I am sure it will wind up some fanboys, though.

Well, obviously a game ported from 360 to PS3 won't use the SPEs that well at all. However, the SPUs will be used for SOMETHING, because the PPU alone can't run code that's made for the Xenon. I'm not saying they are being maximized; that's never the case for quickly-made ports, but the SPUs are a key component of the PS3's processor, and it would seem weird that games that run on a three-core (or dual-core, for PC games) system could be easily ported to run on the Cell's PPU alone.
 

MikeB

Banned
@ Orlics

PPU alone can't run code that's made for the Xenon

I understand your perspective, but, for example, AFAIK no Unreal Engine-based games prior to the release of Gears of War actually used multiple CPU cores on the Xbox 360. For other games which require more processing power than the PPE alone can provide, I agree it makes sense to redesign the game engine for SPE usage, but it may be easier (or rather, cheaper) for some development companies to simply optimise the legacy code some more, running solely on the PPE and RSX. Just look at the visual quality of Genji 2 to see that even with such an approach the PS3 is technically quite capable. Also note the PS3 has memory-speed and data-transfer advantages over the Xbox 360 design, so many early ported games can get away with this approach.
 

MikeB

Banned
This comment seems to indicate that "The Darkness" does not use the SPEs:

"It depends on the type of engine you are doing. In The Darkness they have pretty similar performance, but that is very intentional from our side. We need the two platforms to perform similarly, and therefore we can’t design features that would take advantage of the difference of the two platforms. To my knowledge the PS3 has untapped potential in its seven SPUs"

Source: Edge online
http://www.edge-online.co.uk/archives/2007/02/you_interview_s_1.php
 

spwolf

Member
MikeB said:
This comment seems to indicate that "The Darkness" does not use the SPEs:

"It depends on the type of engine you are doing. In The Darkness they have pretty similar performance, but that is very intentional from our side. We need the two platforms to perform similarly, and therefore we can’t design features that would take advantage of the difference of the two platforms. To my knowledge the PS3 has untapped potential in its seven SPUs"

Source: Edge online
http://www.edge-online.co.uk/archives/2007/02/you_interview_s_1.php


that explains a lot about the demo :D
 

Gibb

Member
alske said:
Each SPE has 256 KB of memory that is as fast as cache. It is not cache, though; it's completely under programmer control. Check the diagrams on page 24 of the Cell Broadband Engine Architecture. It appears that there is a shared cache for all the SPEs, but that is likely for when you try to pull in data over the EIB.
From a HS dev's blog, regarding "Atomic Cache Unit":

The ACUs are a part of each SPU that allows atomic updates to occur very quickly. It appears fairly simple: each SPU has 512 bytes of cache (yes, contrary to what you might have heard, SPUs do have a tiny bit of cache). The 512 bytes are divided into four 128-byte lines.

check article here: http://blog.deanoc.com/?p=96
 

Busty

Banned
MikeB said:
This comment seems to indicate that "The Darkness" does not use the SPEs:

"It depends on the type of engine you are doing. In The Darkness they have pretty similar performance, but that is very intentional from our side. We need the two platforms to perform similarly, and therefore we can’t design features that would take advantage of the difference of the two platforms. To my knowledge the PS3 has untapped potential in its seven SPUs"

Source: Edge online
http://www.edge-online.co.uk/archives/2007/02/you_interview_s_1.php


Thanks for the update MikeB. The wording seems a little vague but quite possible considering the final game.

Didn't Starbreeze outsource the development (well, the port, if we're honest) of the PS3 version to another developer?
 

MikeB

Banned
I have now added the following regarding Killzone 2 to the original post:

"We've created our own proprietary technology to drive the game, and this is using many of PS3's specific strengths. Large quantities of data can be streamed because we have a great deal of storage capacity. This allows for the level of detail you can see in the game.

It is not a luxury to have Blu-ray, but rather a necessity, as compression only gets you so far. I mean, the level that we showed at E3 and Leipzig topped out around 2GB! Also having the CELL and SPUs means we can offload all of our physics processing to an SPU, or process AI using the SPU's. All this processing power just means we can add more detail and create that Hollywood-type realism we're after."

Source: GamePro.com

I think the following article sheds more light on how the SPEs are currently used in Killzone 2, and addresses the advanced deferred rendering techniques Guerrilla Games already implemented for the game so far.

[Killzone 2 screenshot]


Deferred Rendering in Killzone 2
 

MikeB

Banned
This may be of relevance to Ratchet and Clank: TOD or the recently announced sequel to Resistance: Fall of Man.

An interesting read from Insomniac regarding their new igPhysics system.

Introducing SPU shaders:
http://www.insomniacgames.com/tech/articles/0907/files/spu_shaders_introduction.pdf

Benefits of SPU Shaders to igPhysics
● Pipeline well defined and completely SPU-driven
● SPU processing completely asynchronous
● Data well-organized and well-defined
● No (or minimal) PPU intervention
 

MikeB

Banned
With regard to Half-Life 2: The Orange Box, I expect to see no miracles.

Microsoft millionaire from the single-tasking, CLI-only MS-DOS era and co-founder of Valve, Gabe Newell:

[photo of Gabe Newell]


"I think the PS3 is a waste of everybody’s time. Investing in the Cell, investing in the SPE gives you no long-term benefits. There’s nothing there that you’re going to apply to anything else. You’re not going to gain anything except a hatred of the architecture they’ve created."

BTW, other low-end and higher-end Cell-based products are being planned; not for nothing are Cell processors being further developed, clocked both lower and higher than the version specced for the PS3. Developing with the SPEs in mind surely provides long-term benefits, as the PS3 will be around for quite a while and a PS4 will almost certainly use a higher-specced Cell processor as well.
 

PAYBACKill

Junior Member
MikeB said:
This may be of relevance to Ratchet and Clank: TOD or the recently announced sequel to Resistance: Fall of Man.

An interesting read from Insomniac regarding their new igPhysics system.

Introducing SPU shaders:
http://www.insomniacgames.com/tech/articles/0907/files/spu_shaders_introduction.pdf

Benefits of SPU Shaders to igPhysics
● Pipeline well defined and completely SPU-driven
● SPU processing completely asynchronous
● Data well-organized and well-defined
● No (or minimal) PPU intervention


nice find.
 

chubigans

y'all should be ashamed
MikeB said:
With regard to Half-Life 2: The Orange Box, I expect to see no miracles.

Microsoft millionaire from the single-tasking, CLI-only MS-DOS era and co-founder of Valve, Gabe Newell:

"I think the PS3 is a waste of everybody’s time. Investing in the Cell, investing in the SPE gives you no long-term benefits. There’s nothing there that you’re going to apply to anything else. You’re not going to gain anything except a hatred of the architecture they’ve created."

BTW, other low-end and higher-end Cell-based products are being planned; not for nothing are Cell processors being further developed, clocked both lower and higher than the version specced for the PS3. Developing with the SPEs in mind surely provides long-term benefits, as the PS3 will be around for quite a while and a PS4 will almost certainly use a higher-specced Cell processor as well.

Well it's a good thing Valve has nothing to do with the port of Orange Box to PS3. :)
 

MikeB

Banned
Some encouraging comments from Activision, but four to five years? I really don't think it will take that long, as most first and second parties are already tapping into a good deal of the enormous additional performance potential with their leading PS3 titles (R&C Future, Uncharted, Killzone 2, Final Fantasy XIII, etc.). I don't think third parties can afford to stay behind for this long with regard to enhancing their game engines. They will start to look incompetent in consumers' eyes, and nobody wants their company to be viewed as such.

Activision: PS3 "Most Advanced Gaming Platform"
http://www.psxextreme.com/ps3-news/2030.html

We don't expect every developer to love the PS3, but we do expect a wee bit of effort. There has been a lot of whining concerning Sony's new platform - it was the same way with the PS2 - but the whining has lessened over the course of 2007, and more developers are seeing the light. One of them (who wasn't "whining" to begin with) is Activision, who threw Sony some respect a few days ago.

Speaking at last week's Web 2.0 Summit conference in the U.S., Activision chairman and CEO Robert Kotick told an audience member that he thinks the PlayStation 3 is the "most advanced gaming platform available." At the same time, though, he had to add something we all knew was true: "few game developers were building products that take full advantage of the console's powerful, multicore processor." On the other hand, Kotick said the future was bright; all this would change in "the next four or five years." (quote source: *****)

Activision is a lot like EA; they simply make multiplatform games to get the most out of their investments. Obviously, the PS3 is included in most of their productions thus far, so maybe they're more comfortable than others with the system's complex architecture. Those who are tend to praise the console's potential while admitting it's very difficult to work with, and that's exactly what Activision is doing here. Well, we certainly can't wait for those "next four or five years," that's for sure.
 