
PS3 games list & SPE usages

FirewalkR

Member
Hey Mike, don't know if you've seen Mike Acton's first post, before the aforementioned one, here:
http://www.neogaf.com/forum/showpost.php?p=10068493&postcount=4862

Mike Acton said:
Pimpwerx said:
That said, I wonder what a 1/60th of a second "lag" is going to look like to the eye. I mean, it's gotta be really hard to see. But the eye picks up on the smallest changes.

This is an evolution of the model we used for RCF and I'm pretty sure it was totally unnoticed there, as I'd expect. There's no visible "lag". For effects that one would expect to be "in sync" and on the same frame as something else (e.g. attached to a moving part), they'd likely be classified as more the "immediate" type.

"Immediate" in this context means "the work will be done somewhere later in the frame" (so it's still technically "deferred", just not by a huge amount)

"Deferred" in this context means "the work will be done somewhere later than that, probably the next frame"

Kinan said:
Biggest news is the use of deferred rendering in R2, which means that we may expect KZ2-quality lighting in the sequel. Can't wait for the first footage.

You can not infer "deferred rendering" a la Killzone2 from this use of the word "deferred", as filopilo pointed out. It just means "done later" - and it's used everywhere as part of normal asynchronous design. In this sense we have deferred updates for collision, animation, physics, effects, water and a ton of other things.

I'm not sure this entirely relates to the context of SPE usage in this thread, but it's interesting nonetheless, in case you didn't see it. Oh and, it's Mike Acton, not Action. Unless you're doing it on purpose, which I entirely understand because it's so cool. :D
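By the way, for the non-coders wondering what that "immediate" vs. "deferred" classification might look like in practice, here's a tiny sketch of the idea in C. This is purely my own illustration of the queuing concept, nothing to do with Insomniac's actual code:

Code:
/* Toy sketch of "immediate" vs "deferred" work queues.
   Names and structure are made up for illustration. */
#include <stddef.h>

typedef void (*update_fn)(void *data);
typedef struct { update_fn fn; void *data; } job_t;

#define MAX_JOBS 1024

static job_t immediate_q[MAX_JOBS];
static size_t n_immediate;

/* Double-buffered: jobs deferred this frame run next frame. */
static job_t deferred_q[2][MAX_JOBS];
static size_t n_deferred[2];
static int cur;  /* index of the queue being filled this frame */

/* "Immediate": queued now, guaranteed to run later THIS frame. */
void push_immediate(update_fn fn, void *data) {
    immediate_q[n_immediate++] = (job_t){ fn, data };
}

/* "Deferred": queued now, but may not run until the NEXT frame. */
void push_deferred(update_fn fn, void *data) {
    deferred_q[cur][n_deferred[cur]++] = (job_t){ fn, data };
}

void run_frame(void) {
    int prev = cur ^ 1;

    /* Work deferred during the previous frame runs now: a one-frame
       delay that, per Acton, is not visible for most effects. */
    for (size_t i = 0; i < n_deferred[prev]; i++)
        deferred_q[prev][i].fn(deferred_q[prev][i].data);
    n_deferred[prev] = 0;

    /* ... game update code runs here, pushing new jobs ... */

    /* Immediate jobs still drain before this frame is presented, so
       anything that must stay in sync (e.g. an effect attached to a
       moving part) lands on the same frame it was requested. */
    for (size_t i = 0; i < n_immediate; i++)
        immediate_q[i].fn(immediate_q[i].data);
    n_immediate = 0;

    cur ^= 1;  /* swap deferred queues for the next frame */
}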
 

3rdman

Member
TAFKAA said:
whine.jpg
This whole thread is ridiculous and as a PS3 owner it's just embarrassing, but please feel free to continue your object worship.
 

MikeB

Banned
FirewalkR said:
I'm not sure this entirely relates to the context of SPE usage in this thread, but it's interesting nonetheless, in case you didn't see it. Oh and, it's Mike Acton, not Action. Unless you're doing it on purpose, which I entirely understand because it's so cool. :D

Oh, thanks and corrected. He made me think of Action somehow. :D

@ 3rdman

Any kind of Cell worship for this thread is welcome, be it PS3 games, Black Hole Collisions research or a future Cell phone. :lol
 

Sol..

I am Wayne Brady.
AgentOtaku said:
dude...WTF

...really? how is it worship?

lol, the ill will still amazes me. So MikeB consolidated a lot of information on PS3 development and now we are crazed worshipers rather than people who just want a little insight on how they do shit.
 

wazoo

Member
Red Faction's real-time destruction

Red Faction: Guerrilla producer tells VideoGamer.com dev team is "finding some of the challenges of the PS3 beneficial".

The PS3 has the potential to achieve better graphics than the Xbox 360, a Volition developer has told VideoGamer.com.

Speaking to VideoGamer.com at THQ's Gamers Day in San Francisco, California, this week, Jeff Carroll, associate producer in charge of multiplayer and PS3 on upcoming third-person action game Red Faction: Guerrilla, said that while developers will struggle with the PS3 at first, once they "figure it out" they will find "a very powerful system underneath".

Red Faction: Guerrilla, the third in the popular series, sees a switch from the FPS view seen in the first two games to a third-person view. Scheduled for a simultaneous Xbox 360, PS3 and PC release in THQ's fiscal year 2009, the game has come on leaps and bounds since its first-look public unveiling earlier in the year.

When asked if there would be any differences between the Xbox 360 and PS3 versions, Carroll replied: "We've got the PS3 running at a similar frame rate to the 360. We're finding some of the challenges of the PS3 beneficial. We feel like we're doing rather well with the PS3."

Carroll went into more detail on the benefits of the PS3: "We have been getting the physics system, the Havok system and our own proprietary destruction system running on the SPUs, as well as our animations. We're seeing quite a decent speed improvement now that we've effectively got that running on the SPUs instead of the CPUs."

Carroll agreed, however, that many developers have found the PS3 to be a challenging platform, and admitted that Volition shared that struggle, at first.

"We were like that at the beginning. But I think mainly because we had to get our destruction system running on the SPUs, we had to get it running well in order to succeed, we felt a tremendous pressure right at the beginning to learn as much as we could about the PS3. And our programmers have done a wonderful job."

When asked whether it is possible to achieve better graphics on the PS3 compared with the Xbox 360, Carroll said: "I think potentially you will. I think early on people are going to be struggling to figure out exactly how to make the PS3 work. But I think once they do they will see it's a very powerful system underneath. They both have their advantages. The Xbox 360 pushes polygons rather well, which for us, because everything is destructible, that means that we have a tremendous number of polygons in our game compared to regular games, we see a great benefit with the Xbox 360, but we just learn to optimise in different ways for the PS3."

In Red Faction: Guerrilla, players will be able to destroy every building they see through a combination of weapons, vehicles and explosives. Real world physics are applied to the structures, often leading to a collapse through real-time structural stress.

Carroll added that the game is "very much" pushing the PS3, "both in terms of memory and processing power".

You can see for yourself how Red Faction: Guerrilla is shaping up over at our hands-on preview, and cast your eyeballs over some brand new screens hot off the press.

Red Faction: Guerrilla is due out some time during THQ's 2009 fiscal year for Xbox 360, PS3 and PC.
 

MikeB

Banned
@ wazoo

Yes, some interesting (yet IMO unsurprising) comments. Crytek also shared some interesting comments recently:

"The surprising thing has been how well the consoles can perform visually, once this tailoring is in place. We expect the final outcome will result in games that look like they're running at high settings, or nearly high settings, on a PC. Actually, we found it as much or more challenging to address the memory limitations of the consoles when converting our current AI system, as we did while converting our rendering engine or physics system, which was not something you might have expected at the start."

Graphically and performance-wise, the porting of their engine seems to be going smoothly. AI, as we have heard from other cross-platform developers in the past, is more of a challenge. In the case of the PS3 it would require a significant redesign to use the SPUs well for AI. But as Insomniac stated, using the SPUs means being able to do more stuff simultaneously in the end.

"Well there is no doubt that porting our engine to the PlayStation 3 is the more challenging of our two ongoing conversion projects, but that works to our advantage in the end. We feel certain we have the ability to get the most that is possible out of that platform, and therefore PS3 games which run on our engine in the future will definitely stand significantly apart from other games that don't."

This part seems to relate back to the efficient engine design and structuring discussed earlier within this thread, in the end benefitting the game engine in general, and so also the PC/360 ports. Judging from the presentation slides, the game engine already seems heavily multi-threaded, so it could probably be redesigned into an excellent fit for the Cell processor.

Complete interview here:
http://uk.pc.ign.com/articles/864/864970p1.html?RSSwhen2008-04-07_135300&RSSid=864970
 

PistolGrip

sex vacation in Guam
It will be interesting to see what they can do, but like all engine vendors, I would be wary about what they promise.
 

MikeB

Banned
I have added the latest Mike Acton development Q&As to the original post:

Insomniac Q&As on how they try to help 3rd party developers and why some developers are still struggling with the Cell architecture:

Q&A: Insomniac's Mike Acton - Part 1
Q&A: Insomniac's Mike Acton - Part 2

"What I've always said is that bad code, and bad data design in particular, is bad on any architecture, but it's particularly bad on the PS3 because the Cell is a much more modern, much more heterogeneous design. It's much more parallel, and so requires good data design and good code. So if you're poorly designing your data and your code, then yeah, I can see why it'd be difficult to take something like that and try and manipulate it to work on the PS3, especially when people have invested a huge amount of money and time on something that basically doesn't fit a modern methodology. Yeah, it's going to be time-consuming to get that to work - if it's at all possible."

"It's interesting, because I think that probably the oldest programming methods are the most relevant today. It's the habits over the last five or eight years that are struggling, and it's interestingly the people that are more recently out of school that are going to have the most trouble, because the education system really hasn't caught up to how the real world is, how hardware is changing and how development is changing."
 

Gibb

Member
Warrior300 said:
I wonder how much a game like MGS4 uses?

Kojima: "We're using the Cell engine to its limit"

but, based on the recent tech talk from Insomniac's Mike Acton, it's all about optimization.. so I guess he should have stated "its current limit" :D
 

MikeB

Banned
Gibb said:
Kojima: "We're using the Cell engine to its limit"

but, based on the recent tech talk from Insomniac's Mike Acton, it's all about optimization.. so I guess he should have stated "its current limit" :D

Yes, I think he just had unrealistic expectations about what could be achieved within MGS4's development timeframe. Looking at the PS3 specs on paper it's easy to become over-ambitious, but raw specifications will not highlight the challenges of the development process. Top results can only be achieved (combined with R&D) either by building a game engine from scratch or through the still very time- and effort-consuming redesign of most legacy game engines.

Kojima:

"Game-wise, it's pretty close to the original vision: you sneak into the battlefield and can choose whether to do a stealth game or interfere with the battle more directly. But the graphic, side things like motion-blending and the size of the map, totally was not accomplished to my original vision - to my satisfaction." Source: UK's Edge magazine

GamePro hands-on:



"The graphics: You thought Gears of War looked amazing? Think again. Metal Gear Solid 4 will blow you away with its ultra-detailed characters and intricate environments, painting some of the most gorgeous graphics seen this side of Crysis. In motion, the graphics look so realistic that your eyes begin to register the visuals as a movie rather than a game. From what we saw, MGS4 puts its 50GB Blu-ray disc to outstanding use."

http://www.gamepro.com/sony/ps3/games/previews/173848.shtml
 

FirewalkR

Member
Hah, I knew this thread was due for an update, after all the recent Insomniac stuff. :)

Mike Acton said:
"It's interesting, because I think that probably the oldest programming methods are the most relevant today. It's the habits over the last five or eight years that are struggling, and it's interestingly the people that are more recently out of school that are going to have the most trouble, because the education system really hasn't caught up to how the real world is, how hardware is changing and how development is changing."

I agree completely and couldn't help but smile at his comment about Office performing slowly. I've said many times, talking to friends, that the machines we've got nowadays should be able to perform way faster with current OS's. I mean, back in the Amiga and PC 286/386 days, we had processors performing at around 30 MHz or less, and they ran stuff quite well. Today's computers have processors working at 100x the clock speed. I know real performance doesn't scale proportionally, but still these machines are almost unbelievably faster and have access to a lot more memory.

Obviously, modern OS's are usually doing much more stuff, and running more programs in parallel, and in such complex systems programmers must keep in mind code readability, maintainability, and reuse, and, very importantly, compatibility among a multitude of hardware; but this isn't new. Even so, modern machines should, in my opinion, perform much, much better, and this is the result of the total lack of a "culture of optimization". Long gone are the days of thousands of lines of assembly code in game/demoscene/OS programs; increased complexity forced this to happen, along with the appearance of visual tools that do half the work for you, but it feels like we are now at the other end of the spectrum.

And this brings me back to modern console/multicore PC development. There are no "magic tools" to extract performance out of these architectures, and people have no training on how to use them. Parallel processing was until recently the domain of supercomputer applications and academic research, and most people, fresh out of college, are unprepared for this and the kind of optimization mentality it requires, and are sort of "thrown to the wolves", and this results in what I feel are very low performance levels.

I think in a few years the game industry is going to advance programming for parallel architectures more than it has advanced in the last few decades. And it's funny that Mike Acton, a game developer, is so aware of this, because I've believed for a long time that the game industry has the best programmers, especially in low level programming. And if even these people are struggling, I wonder what will happen soon in other areas of programming, now that PC architectures ceased to increase clock speed and went multi- and many-core.

Damn, that was long, I hope it makes some sense because right now I don't feel like proof-reading. I almost don't remember what my point was. :lol Back to work now. :p

Gibb said:
Kojima: "We're using the Cell engine to its limit"

but, based on the recent tech talk from Insomniac's Mike Acton, it's all about optimization.. so I guess he should have stated "its current limit" :D

Yeah, he probably meant they're using the Cell as much or as well as anyone else. Sony obviously gave them as much support as they wanted so they should be on par with all the 1st parties.
 

MikeB

Banned
I added the following comments with regard to Resistance 2 and Far Cry 2 (mentioned earlier) to the original post and post #32.

Resistance 2:

"For example, the physics, animation, glass, inverse kinetics, effects, and geometry database systems (just to start with) are now less complicated, thus offering more and significantly faster features than the versions found in Resistance 1.

We've also solidified some design patterns that are simplifying things. Take SPU Shaders, for example, which we discuss in detail on our newly established R&D site. SPU Shaders helped to make the big systems and all the little changes that come along during development a lot more practical to implement. They've also helped shed some light on programming the SPUs. Just having the ability to start putting high-level logic and AI on the SPUs was a major milestone that validated a lot of our ideas on how to distribute that type of work."

http://blogs.guardian.co.uk/games/a...iac_on_ps3_and_nextgen_game_development_.html
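As I understand the SPU Shaders idea from their R&D site (very loosely; the sketch below is my own illustration, not Insomniac's actual API), it lets gameplay code inject a small per-object callback into a big batched SPU job, so the heavy system stays data-driven while the game-specific logic stays flexible:

Code:
/* Rough sketch of the "SPU Shader" flavour of design: a batched
   system job runs a small user-supplied "shader" per element.
   All names here are illustrative. */
typedef struct { float pos[3]; float vel[3]; } particle_t;

/* The "shader": a small piece of per-object logic supplied by
   gameplay code and shipped along with the batch job. */
typedef void (*spu_shader_fn)(particle_t *p, float dt, void *params);

/* The big system job: runs the fixed-function update on a whole
   batch, then the injected shader on each element. */
void run_particle_batch(particle_t *batch, int count, float dt,
                        spu_shader_fn shader, void *params) {
    for (int i = 0; i < count; i++) {
        batch[i].pos[0] += batch[i].vel[0] * dt;
        batch[i].pos[1] += batch[i].vel[1] * dt;
        batch[i].pos[2] += batch[i].vel[2] * dt;
        shader(&batch[i], dt, params);  /* game-specific bit */
    }
}

/* Example shader a gameplay programmer might supply: simple drag. */
static void drag_shader(particle_t *p, float dt, void *params) {
    float k = *(float *)params;
    p->vel[0] -= p->vel[0] * k * dt;
    p->vel[1] -= p->vel[1] * k * dt;
    p->vel[2] -= p->vel[2] * k * dt;
}

The win is that the batched system keeps streaming data through local store, while designers and gameplay programmers still get per-object hooks without touching the SPU plumbing.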

17) Far Cry 2

"The R&D revealed some pleasant surprises, as Guay explained: "One thing that we realized pretty quickly as we started R&D on PS3, was that the hardware architecture had a very nice fit with some of our technical design decisions. We were positively surprised by how efficient the SPUs (the Cell processing units) were to do such things as run our vegetation simulation, our animations or our physics systems."

Guay also expressed how impressed he has been with Blu-ray and the PS3's hard drive, noting: "The hard drive and Blu-ray are making our life easy considering FC2 is an open world continuously streamed around the player. That streaming bandwidth and disk space is very appreciated."

Source: VideoGamer.com
 

reakt

Member
MikeB said:
6) Super Stardust HD

"We are able to get over 10,000 active objects with physics and collisions and over 75,000 particles simulated and drawn @60fps. That said, we were unable to use all the available processing power from Cell for this game, so for the next game there are still plenty of reserves left"
.. for the next game? Sounds interesting! I didn't think it could get any better than it already is!
 

MikeB

Banned
FirewalkR said:
I agree completely and couldn't help but smile at his comment about Office performing slowly. I've said many times, talking to friends, that the machines we've got nowadays should be able to perform way faster with current OS's. I mean, back in the Amiga and PC 286/386 days, we had processors performing at around 30 MHz or less, and they ran stuff quite well. Today's computers have processors working at 100x the clock speed. I know real performance doesn't scale proportionally, but still these machines are almost unbelievably faster and have access to a lot more memory.

Agreed. Some far more efficiently designed, multimedia-oriented desktop OSes were BeOS (inspired by the Amiga) and AmigaOS; you can still do a lot more stuff simultaneously on these OSes, with much lower performance and memory requirements, and still have a fully responsive system.

BeOS could have been made into an excellent fit for the PS3 hardware. The OS was built from scratch for multi-processing; the original BeBox had several full processors (an approach inspired by Amiga hardware, which had a multi-processing custom chipset taking workload off the 68k CPU), and the OS and many native (non-ported) BeOS applications were designed from the ground up to make good use of multi-threading. Sadly, when BeOS was later ported to the PC there was little benefit, as the OS was simply too far ahead of its time: PCs were still mostly single-core computers, and a lot of the software ported to the system was non-multi-threaded, not very efficiently designed Linux code.

AmigaOS4 today is still really fast on modest hardware (the video demo below is an older version running on an old 800 MHz G3 AmigaOne computer, but the latest version even runs on decades-old upgraded 166 MHz Amigas with only 64 MB of RAM):

http://youtube.com/watch?v=qSA-q1qniMY
 
MikeB said:
Yes, I think he just had unrealistic expectations about what could be achieved within MGS4's development timeframe. Looking at the PS3 specs on paper it's easy to become over-ambitious, but raw specifications will not highlight the challenges of the development process. Top results can only be achieved (combined with R&D) either by building a game engine from scratch or through the still very time- and effort-consuming redesign of most legacy game engines.

Kojima:

"Game-wise, it's pretty close to the original vision: you sneak into the battlefield and can choose whether to do a stealth game or interfere with the battle more directly. But the graphic, side things like motion-blending and the size of the map, totally was not accomplished to my original vision - to my satisfaction." Source: UK's Edge magazine

GamePro hands-on:



"The graphics: You thought Gears of War looked amazing? Think again. Metal Gear Solid 4 will blow you away with its ultra-detailed characters and intricate environments, painting some of the most gorgeous graphics seen this side of Crysis. In motion, the graphics look so realistic that your eyes begin to register the visuals as a movie rather than a game. From what we saw, MGS4 puts its 50GB Blu-ray disc to outstanding use."

http://www.gamepro.com/sony/ps3/games/previews/173848.shtml

Is this GameFAQs? I thought we avoided quoting GamePro... :D
 
FirewalkR said:
Yeah, he probably meant they're using the Cell as much or as well as anyone else.

Well, it's like racing a line in GT: everyone's going to push their Ferrari as hard as possible, but due to differences in drivers' intelligence and skill sets, you're going to get wild differences in lap times.
 
akachan ningen

Gibb said:
Kojima: "We're using the Cell engine to its limit"

It sounds impressive, but Kojima is a designer, not a programmer. If the lead programmer at KojiPro had said this, I might believe it. I think Kojima is just trying to say it looks amazing.
 

MikeB

Banned
Current and future European PS3 developers may want to participate in this "DevStation 2008" event:

"London - 10th and 11th of June

Presentations focus around the core technologies and features of PS3™ and for the first time, we are providing content for all disciplines (design, production, art, audio and programming), with focus on physics, SPU optimization and audio tricks right through to the latest developments in the PLAYSTATION®Network."

"Presentations will focus on the core technologies and features of PS3™ with content on:

* Graphics
* Effective Cell and SPU utilization
* Profiling
* Debugging
* Performance optimization through to the latest developments with the PLAYSTATION®Network
* Real world case studies from guest developers."

http://devstation.scee.com/
 

shantyman

WHO DEY!?
akachan ningen said:
It sounds impressive, but Kojima is a designer, not a programmer. If the lead programmer at KojiPro had said this, I might believe it. I think Kojima is just trying to say it looks amazing.

He is the director of the game. Why would you think he wouldn't know this?
 

SRG01

Member
FirewalkR said:
I agree completely and couldn't help but smile at his comment about Office performing slowly. I've said many times, talking to friends, that the machines we've got nowadays should be able to perform way faster with current OS's. I mean, back in the Amiga and PC 286/386 days, we had processors performing at around 30 MHz or less, and they ran stuff quite well. Today's computers have processors working at 100x the clock speed. I know real performance doesn't scale proportionally, but still these machines are almost unbelievably faster and have access to a lot more memory.

Obviously, modern OS's are usually doing much more stuff, and running more programs in parallel, and in such complex systems programmers must keep in mind code readability, maintainability, and reuse, and, very importantly, compatibility among a multitude of hardware; but this isn't new. Even so, modern machines should, in my opinion, perform much, much better, and this is the result of the total lack of a "culture of optimization". Long gone are the days of thousands of lines of assembly code in game/demoscene/OS programs; increased complexity forced this to happen, along with the appearance of visual tools that do half the work for you, but it feels like we are now at the other end of the spectrum.

And this brings me back to modern console/multicore PC development. There are no "magic tools" to extract performance out of these architectures, and people have no training on how to use them. Parallel processing was until recently the domain of supercomputer applications and academic research, and most people, fresh out of college, are unprepared for this and the kind of optimization mentality it requires, and are sort of "thrown to the wolves", and this results in what I feel are very low performance levels.

I think in a few years the game industry is going to advance programming for parallel architectures more than it has advanced in the last few decades. And it's funny that Mike Acton, a game developer, is so aware of this, because I've believed for a long time that the game industry has the best programmers, especially in low level programming. And if even these people are struggling, I wonder what will happen soon in other areas of programming, now that PC architectures ceased to increase clock speed and went multi- and many-core.

Damn, that was long, I hope it makes some sense because right now I don't feel like proof-reading. I almost don't remember what my point was. :lol Back to work now. :p

To reference a Cell development course I linked to a few months ago, Cell programming really is a throwback to the old days of programming, where you need a deeper understanding of the underlying hardware instead of separating the software and hardware layers from each other.

For myself, Cell programming is very much like embedded programming; although a lot of things can be implemented through libraries, you still need a lot of hardware knowledge to get the most out of it.
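To give a flavour of what I mean: on an SPE you can't just dereference main memory. You explicitly DMA data into the 256KB local store, work on it there, and DMA the results back out, roughly like this (a simplified sketch using the Cell SDK's spu_mfcio.h intrinsics; real code would double-buffer and check sizes properly):

Code:
/* SPU-side sketch: explicit DMA in, process, DMA out. */
#include <spu_mfcio.h>

#define CHUNK 4096  /* bytes; a single DMA must be <= 16KB */

/* Local-store buffer; 128-byte alignment is optimal for DMA. */
static volatile float ls_buf[CHUNK / sizeof(float)]
    __attribute__((aligned(128)));

void process_chunk(unsigned long long ea) {  /* effective address in main RAM */
    const unsigned int tag = 1;

    /* Pull the chunk from main memory into local store. */
    mfc_get(ls_buf, ea, CHUNK, tag, 0, 0);

    /* Block until the DMA with our tag completes. */
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();

    /* Work on the data while it sits in local store. */
    for (unsigned int i = 0; i < CHUNK / sizeof(float); i++)
        ls_buf[i] *= 2.0f;

    /* Push the results back out to main memory and wait again. */
    mfc_put(ls_buf, ea, CHUNK, tag, 0, 0);
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();
}

That manual movement of memory is exactly the embedded-style discipline I'm talking about: the hardware gives you the speed, but only if you plan your data flow yourself.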
 
SRG01 said:
For myself, Cell programming is very much like embedded programming; although a lot of things can be implemented through libraries, you still need a lot of hardware knowledge to get the most out of it.
Console development has always been like this. Even on the relatively n00b friendly XBox / 360.
 

AKS

Member
MikeB said:
Yes, I think he just had unrealistic expectations about what could be achieved within MGS4's development timeframe. Looking at the PS3 specs on paper it's easy to become over-ambitious, but raw specifications will not highlight the challenges of the development process. Top results can only be achieved (combined with R&D) either by building a game engine from scratch or through the still very time- and effort-consuming redesign of most legacy game engines.

Kojima:

"Game-wise, it's pretty close to the original vision: you sneak into the battlefield and can choose whether to do a stealth game or interfere with the battle more directly. But the graphic, side things like motion-blending and the size of the map, totally was not accomplished to my original vision - to my satisfaction." Source: UK's Edge magazine

GamePro hands-on:



"The graphics: You thought Gears of War looked amazing? Think again. Metal Gear Solid 4 will blow you away with its ultra-detailed characters and intricate environments, painting some of the most gorgeous graphics seen this side of Crysis. In motion, the graphics look so realistic that your eyes begin to register the visuals as a movie rather than a game. From what we saw, MGS4 puts its 50GB Blu-ray disc to outstanding use."

http://www.gamepro.com/sony/ps3/games/previews/173848.shtml

Definitely. Many seem to be missing the fact that Kojima's standards != normal standards. Critics have gone nuts over MGS4's graphics. GamePro is one example, but it is hardly an isolated incident. I mean, anyone can just take a look at what's there and see it looks sensational.

Also, as good as it looks, I wouldn't be surprised if the sound was just as good or better. Think of how good MGS3 sounded on the ancient PS2. I can't wait to hear this game in surround sound. It could be the best sounding game to ever hit a console.
 

MikeB

Banned
@ AKS

Many seem to be missing the fact that Kojima's standards != normal standards.

He designed 3 big MGS games before this, when he closes his eyes at night he must be dreaming of some amazing battlegrounds.

Back in the 80s, as a kid, I sometimes dreamt of what computing would become like by the year 2000. Compared to my expectations at the time, computing advancements have been very disappointing, especially PC advancements.

Back in the 80s, operating system boot-up times were near instant, or at most it took only a few seconds to load a full-blown, mouse-controlled desktop environment from the hard drive. With all the technology advancements I envisioned, I imagined computing would become near instant, just like turning on a TV; actually, I thought turning on the computer would allow you to instantly continue your work at the point you turned off the system. So if you were painting a picture, turned off the system, then turned it on again, you should just be able to continue where you left off.

Instead, what we got today isn't much different from what was already possible in the 80s. One of the most profound user-experience advancements for PC users is probably multi-tasking, but copy & pasting between a word processor, spreadsheet, database and paint program was already possible on my Amiga from the 80s. Internet advancements so far have also been unsurprising and underwhelming (in the early 90s I read a lot of message boards and downloaded lots of Amiga games from the internet), and playing games online today isn't that different from playing 80s Amiga games like Stunt Car Racer (a game at its core pretty similar to, for example, Motorstorm) or Battle Chess over a null-modem cable.

I would have envisioned games like Gran Turismo becoming much more than a glorified version of Test Drive (an 80s '3D' driving game from an in-car perspective) with hyper-realistic graphics. By now I would have imagined small, high-resolution virtual reality glasses with stereo sound and a microphone, being able to play a game like Motorstorm or GT5 and look beside you out of a virtual car window, or watch movies from within a virtual cinema with friends, etc. After all, there were already Amiga-based game systems supporting virtual reality headsets and hand tracking, able to run 4-player 3D first-person shooters in Death Match and Capture the Flag modes well before Wolfenstein 3D, and years before Doom was released for the PC.

Creating videos I could already do on the fly in the 80s; adding subtitles and special effects to recordings was actually easier. IMO the advancements we have seen in computing since the early-to-mid nineties have been lacklustre, to say the least. Windows XP still does not always respond to my user input the way my 7 MHz, 1 MB Amiga from the 80s always did when multi-tasking several applications.

I can understand Kojima may be disappointed, as the Cell offers so much potential for the stuff he probably envisioned. Sadly, the reality seems to be that things evolve at a much slower pace than many would want technology to advance, especially PC technology, for which the user experience is quite horrible compared to what I'd imagined. Windows Vista, instead of being more optimised and efficient, seems even worse than XP in this regard: a constant push for hardware upgrades, virus/trojan threats, etc., etc.
 

DCharlie

And even i am moderately surprised
Sadly, the reality seems to be that things go at a much slower pace than many would want technology to advance, especially PC technology, for which the user experience is quite horrible compared to what I imagined.

Although PC -tech- itself is now moving very, very fast. I only have to look at the machine I bought 3 months back compared to what I can get now for the same money, and the difference in performance is pretty stark.

Though it's hamstrung by some issues with Vista, I think 'horrible' is a wee bit OTT.
 
Feasible_Weasel

sakuragi said:
From what I understand with my lack of developing knowledge, developers don't need to take advantage of the Cell's SPEs for multiplatform games, since there isn't a need for them. Like Valve mentioned, they didn't need to use any of the Cell's SPEs for Half-Life 2 / The Orange Box, while they had to use ninety-something percent of the Xbox 360's power capacity. Which, from this quote, indicates that even without the Cell's SPEs, the PS3 is as powerful as the Xbox 360. However, if they did use all of the PS3's capabilities, the PS3 is indeed leaps and bounds ahead of the Xbox 360 in terms of power.
What a silly post. Devs have posted saying that the difference in CPU is 20%; hardly leaps and bounds. Whereas the 360's GPU has its own memory and better texture support (show me a game on PS3 with textures like PD0 and Kameo), so we have different strengths on each console, with the PS3 being better at running physics and the Xbox 360 at textures.
 

DeadGzuz

Banned
Feasible_Weasel said:
Whereas the 360's GPU has its own memory and better texture support (show me a game on PS3 with textures like PD0 and Kameo), so we have different strengths on each console, with the PS3 being better at running physics and the Xbox 360 at textures.

The RSX has a dedicated 256MB of RAM; the Xenos has a 10MB eDRAM buffer so it does not get starved by the shared memory. What is "better texture support"? Both machines support the same size textures and can apply the same types of shaders. Have you not seen Uncharted? Making everything shiny is not a sign of "better texture support". Why did I click on this thread?
 

DCharlie

And even i am moderately surprised
Let's not make this another X360 vs PS3 thread; there are enough of them elsewhere.
 
DeadGzuz said:
The RSX has a dedicated 256MB of RAM; the Xenos has a 10MB eDRAM buffer so it does not get starved by the shared memory. What is "better texture support"? Both machines support the same size textures and can apply the same types of shaders. Have you not seen Uncharted? Making everything shiny is not a sign of "better texture support". Why did I click on this thread?
The GPUs are different... and yes, I own Uncharted; texture-wise, compare it to an Xbox 360 game, and that would be Tomb Raider: Legend.
From my observations of owning both consoles, the 360 has totally bump-mapped the whole world, whereas the PS3 can only do, say, the track and vehicles.
 

dogmaan

Girl got arse pubes.
MikeB said:
@ AKS



He designed 3 big MGS games before this, when he closes his eyes at night he must be dreaming of some amazing battlegrounds.

Back in the 80s, as a kid, I sometimes dreamt of what computing would become like by the year 2000. Compared to my expectations at the time, computing advancements have been very disappointing, especially PC advancements.

Back in the 80s, operating system boot-up times were near instant, or at most it took only a few seconds to load a full-blown, mouse-controlled desktop environment from the hard drive. With all the technology advancements I envisioned, I imagined computing would become near instant, just like turning on a TV; actually, I thought turning on the computer would allow you to instantly continue your work at the point you turned off the system. So if you were painting a picture, turned off the system, then turned it on again, you should just be able to continue where you left off.

Instead, what we got today isn't much different from what was already possible in the 80s. One of the most profound user-experience advancements for PC users is probably multi-tasking, but copy & pasting between a word processor, spreadsheet, database and paint program was already possible on my Amiga from the 80s. Internet advancements so far have also been unsurprising and underwhelming (in the early 90s I read a lot of message boards and downloaded lots of Amiga games from the internet), and playing games online today isn't that different from playing 80s Amiga games like Stunt Car Racer (a game at its core pretty similar to, for example, Motorstorm) or Battle Chess over a null-modem cable.

I would have envisioned games like Gran Turismo becoming much more than a glorified version of Test Drive (an 80s '3D' driving game from an in-car perspective) with hyper-realistic graphics. By now I would have imagined small, high-resolution virtual reality glasses with stereo sound, being able to play a game like Motorstorm or GT5 and look beside you out of the window, or watch movies from within a virtual cinema, etc. After all, there were already Amiga-based game systems supporting virtual reality headsets and hand tracking, able to run 4-player 3D first-person shooters in Death Match and Capture the Flag modes well before Wolfenstein 3D, and years before Doom was released for the PC.

Creating videos I could already do on the fly in the 80s; adding subtitles and special effects to recordings was actually easier. IMO the advancements we have seen in computing since the early-to-mid nineties have been lacklustre, to say the least. Windows XP still does not always respond to my user input the way my 7 MHz, 1 MB Amiga from the 80s always did when multi-tasking several applications.

I can understand Kojima may be disappointed, as the Cell offers so much potential for the stuff he probably envisioned. Sadly, the reality seems to be that things go at a much slower pace than many would want technology to advance, especially PC technology, for which the user experience is quite horrible compared to what I imagined. Windows Vista, instead of being more optimised and efficient, seems even worse than XP in this regard: a constant push for hardware upgrades, virus/trojan threats, etc., etc.

I think hardware abstraction is one of the main reasons behind this. Also, Microsoft's philosophy of adding bloatware to its OS, rather than trimming the fat, isn't helping PC OS performance; technically, if I turn off all the new eye candy in Vista, it should run faster than XP

it doesn't

you'd think Microsoft would have spent some of the Vista development time actually optimizing their OS

Hardware abstraction isn't necessarily a bad thing, as it makes development a hundred times easier for developers, but with each iteration of hardware and software, developers are getting further away from the 'metal'

Perhaps as parallel processing matures we may end up with programming languages that are designed for multithreading from the ground up, and compilers that are able to take your code and automatically parallelize it so that performance scales with the number of cores you have (I believe IBM is working on an "Octopiler" or something).
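You can already see a hint of that direction in something like OpenMP, where you annotate a loop and the compiler/runtime spreads it across however many cores are available. A trivial illustration (and obviously not the whole answer for an asymmetric design like Cell):

Code:
/* Build with e.g. gcc -fopenmp. The pragma asks the compiler to
   split the loop iterations across all available cores. */
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(void) {
    static float a[N], b[N], c[N];

    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        c[i] = a[i] * 2.0f + b[i];

    printf("ran on up to %d threads\n", omp_get_max_threads());
    return 0;
}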

Another option is complete abstraction of the platform: using, say, 30+ CPU cores with a massive shared cache (it doesn't need to be x86) that are fully programmable but not overly complicated; you then throw whatever code you want at it (x86, PPC, etc.) and the code is dynamically recompiled or interpreted, kind of like the old Transmeta processors. I also believe Intel is working on a single-thread emulator, to emulate and boost performance of single-threaded code using multi-core processors.

What it really comes down to is: are the developers really up to the possibly monumental task before them? In 2008, developers are finding it tough harnessing the power of Xenon's 3 CPUs and Cell's 7 (looking at you, Valve). In 2010, how on earth are they going to use the 16-24 cores in Intel's Larrabee? By 2015 we will probably have CPUs with hundreds of cores, and chances are the majority of them will be sitting idle while Microsoft Office 2013 takes ten minutes to load a fucking spreadsheet.

But in all honesty I really don't have a clue where technology is going at the moment. One thing is clear, though: while the developers and customers are getting frustrated, the hardware vendors and the publishers will still be making a shit ton of money. :D
 

RobertM

Member
DCharlie said:
Although PC -tech- itself is now moving very, very fast. I only have to look at the machine I bought 3 months back compared to what I can get now for the same money, and the difference in performance is pretty stark.

Though it's hamstrung by some issues with Vista, I think 'horrible' is a wee bit OTT.
Yes, but the programming side of things is not really improving or optimizing to the same degree.
 

Ptaaty

Member
MikeB said:
Back in the 80s, operating system boot-up times were near instant, or at most it took only a few seconds to load a full-blown, mouse-controlled desktop environment from the hard drive. With all the technology advancements I envisioned, I imagined computing would become near instant, just like turning on a TV; actually, I thought turning on the computer would allow you to instantly continue your work at the point you turned off the system. So if you were painting a picture, turned off the system, then turned it on again, you should just be able to continue where you left off.
...

Suspend to RAM (S3, aka STR): look it up... you can already do what you want on almost any mobo. Instant on and continuation.
 

MikeB

Banned
DCharlie said:
Although PC -tech- itself is now moving very, very fast. I only have to look at the machine I bought 3 months back compared to what I can get now for the same money, and the difference in performance is pretty stark.

Though it's hamstrung by some issues with Vista, I think 'horrible' is a wee bit OTT.

I think that from both a hardware and a software perspective it's still moving forward very slowly (especially considering the number of employees and resources today's leading desktop OS companies have at their disposal compared to past companies), most of this caused by a lack of desktop operating system efficiency, a lack of ambition to start over with a clean slate, and a lot of x86 legacy baggage.

I haven't used Vista in a while, but AmigaOS, for instance, was far more open towards the power user (while also being console-like easy for the casual gamer). If a system file got corrupted, I could easily find out which file was affected, then boot from diskette or CD into a GUI or CLI environment to replace the file (with Windows I am forced into a complete re-install). The OS and applications were smarter too: for example, if a new image format was released, supporting it was as easy as installing a datatype (moving the file into the Datatypes directory/folder is sufficient; a noob of course just launches an automatic install program), the same as locale files in the Locale directory, a new printer driver in the Printers directory, and so on. File names were very well structured, relating to the function of the file: no 1.dll, 2.dll or such crap, but bsdsocket.library, dutch.locale, PNG.datatype, etc. So easy to memorize, understand and find relevant files. And all the old programs handling pictures, like word processors, paint programs and presentation software, would recognize the new format; the same goes for new sound and video formats.

The system could display different resolutions simultaneously for huge numbers of different screens, good for testing how, for instance, your website looks in different resolutions with a click of the screen button or by dragging screens. Some early graphics cards supported this feature, but as Windows became dominant this hardware/OS feature got lost (and so it's also gone for AmigaOS4 using more 'modern' graphics cards). Even on the oldest of Amigas you could always move the mouse pointer, and pressing buttons and such always provided instant user feedback, no matter how much stuff you were multi-tasking.

The icon system was a lot more powerful for the power user as well, through the use of tooltype commands in the icon information, where you could easily specify things like window size, the resolution if you set it up to open a new screen, an override of the application language, or any other parameter the programmer deemed useful to support. Installing a new language was as easy as installing a new locale file; you could easily change the OS language on the fly and specify preferred languages per application, for instance 1) Dutch, 2) English if the Dutch locale is not available, 3) German if that's not available, and so on. I'm just scratching the surface here with regard to desirable features. There are so many features, abilities and good ideas lost that I grew up with and became used to in the 80s/early 90s... Back then I felt in total control of my desktop environment and could easily adjust things to suit my needs and preferences.

Of course there have been significant hardware advancements, but IMO modern desktop OSes are lacklustre in many ways compared to my expectations for the future as a kid.
 

MikeB

Banned
Ptaaty said:
Suspend to RAM (S3, aka STR): look it up... you can already do what you want on almost any mobo. Instant on and continuation.

I think my laptop supports this, but it's nowhere close to instant. My laptop goes into a hard-drive-rattling fit, and it takes quite a while for it to finish.

On the Amiga I used a very useful utility called SnoopDOS, which registers all file activity, as well as which tooltypes a program is checking, which fonts, libraries and devices are being loaded, and so on. I tried an equivalent utility for Windows inspired by SnoopDOS and it was pretty much useless to me: unbelievably much activity even when the system seems completely idle, files moving back and forth all the time for no apparent reason, and I couldn't find out what most of the files were used for anyhow. So I removed the utility; looking at all the seemingly useless activity only serves to give the user a headache.
 

MikeB

Banned
Some things Mike Acton states within these interviews reminded me of some lengthy discussions I had within the Amiga community a long time ago regarding lost developer knowledge and competence.

Mike Acton states:

They use a high-level or compiled language, and it’s like a magic box to them. But it's something that as a professional programmer you should know - it should be part of the job description - and I think fundamentally what's missing is an understanding of hardware and how it works and how it fits into the programming ecosystem. So maybe what they should be blending is an electronic engineering degree along with a computer science course.

It's interesting, because I think that probably the oldest programming methods are the most relevant today. It's the habits over the last five or eight years that are struggling, and it's interestingly the people that are more recently out of school that are going to have the most trouble, because the education system really hasn't caught up to how the real world is, how hardware is changing and how development is changing.

To quote myself from before the PS3 launch:

BTW, I've never stated that many aspects of Microsoft's development environments aren't powerful; however, one of the things I dislike the most is that many supported features are largely tied to Microsoft's systems: DirectX, C#, Visual Basic, XNA, all more or less incompatible with independent, internationally used, industry-wide standards. More common toolsets, standards and programming languages would IMO have benefitted everyone working within the computer industry (more jobs, more progress, more competition, etc). IMO some dev options carry the danger of making potentially talented developers lazy, or competence-wise too tied to one platform/toolset, and maybe in some cases unable to truly comprehend how a computer really functions underneath the hood. IMO, if they haven't grown up with dev environments like XNA or Easy AMOS, the more talented developers will be able to master the Cell.

Other PS3 related comments from early 2005:

I understand that games developers may not be too fond of the idea of having to learn new ways to write their software.

Multithreading is mainly useful in a multi-CPU environment. One OS which was designed with multithreading in mind is BeOS. The initial BeBox prototypes had two AT&T Hobbit processors and three DSPs; later versions came with two PowerPC 603 processors clocked at 66 MHz or 133 MHz.

Today single CPU solutions are dominant, thus multithreading isn't really that much of a benefit. Software developers who mainly write for single processor solutions don't like doing extra (time=money) work to get the most out of multithreading for other platforms.

It will probably take some time before developers manage to get the most out of this platform, as has, for example, also been the case with the classic Amiga chipsets. The early Amiga games don't compare well to the complex graphics used by later games like Elfmania or Lionheart.

Some interesting perspectives from an Amiga buddy in reply to the first comment from me above:

We programmers tend to become pretty spoiled nowadays, especially the senior generation which claims to be part of the first generation of developers. People my age were at the front line when gaming was born. They had to hit the HW and come up with some creative ways to get the max out of it. Over time this "species" (the original game developers) has evolved, I would say, into a spoiled elephant.

Now there's middleware. Who would go with their own thing any more? In the end you get your salary (not small, BTW). Don't waste a dime, and get the max out of it. Who has time to investigate new technology? We would earn less than, say, 100K a year. Oh darn.

I'm not saying it's not expensive, and the costs exploded over time, seriously. Exploring a Cell (which I consider the most expensive piece of HW from a SW developer's POV) costs a lot, and might take a couple of months/years to really get something out of it, especially because it requires a complete mental paradigm shift.

Now you sit there, and your whole evolution (dictated by the big MS or others) pushed you in a linear direction. OOP, C++... well, you could adapt a little to think a bit more in parallel; they've told you for years that MHz is not scalable. But sometimes history happens on a different timescale than humans are used to. It happened faster. All of a sudden the market is full of multi-cores. Well, OK, it was hard enough to handle that step, but now Sony (oh you ####s!) comes up with an async parallel machine? How dare you! We just started to adapt, and now this? What the heck!

I don't want to bash on XNA here (I guess I do); I just want to point to a generic problem we have in the industry. I hired a guy recently (well, 2 years ago) because he wasn't bad in the interview. He had one year of experience. We were looking for an entry-level programmer. He had a CS BA and was doing his Masters, which he got last summer. Based on that information you would expect something. But he couldn't program. He left a couple of weeks ago; I probably should have done more code reviews and fired the guy earlier (OTOH this was a more convenient way to handle things). Sorry to say so. But it is hard to find people who understand the fundamentals nowadays. And by spoiling the kids even more with ever better dev tools, nobody cares to look below the surface any more. What a pity.

While it is nice that overall development costs can be lowered by even better tools, it's the knowledge base below that fades away. This becomes pretty visible if something (a new technology) happens outside the mainstream boundaries (such as the PS3, IMO).

Another reason is that the constant cost pressure does not allow you to break away from the mainstream that much. It requires someone like Sony, someone with really huge bucks, to be able to at least give it a try. And even they are on the edge of a failure (what does this tell us regarding the Amiga situation, heh?).

Maybe that's another reason why I'm supporting the PS3.
 
"Blah blah blah...
... XNA-raised entry-levels won't ever be competend enough to work at a lower level ...
blah blah blah..."

What a silly argument..
 

MikeB

Banned
archangelmorph said:
"Blah blah blah...
... XNA-raised entry-levels won't ever be competent enough to work at a lower level ...
blah blah blah..."

What a silly argument..

Here's an interesting observation by a fellow tech writer attending an XNA PR event:

Is XNA just another hobbyist game programming tool, like others including AMOS and STOS (for the Amiga and Atari ST), or more recently DarkBASIC? There is a range of opinions here. Peter Molyneux enthused to me about XNA, saying it is "a fast, efficient, better language" than earlier more compromised tools, and "built with ambition in mind," so that XNA programmers can create the next Populous (the game which made Molyneux his fortune).

On the other hand, Molyneux undermined his evangelism by also stating that C++ remains the language of choice for professional development. If this is the case, then XNA will always be a hobbyist niche, which implies that serious students of game programming should not waste too much time on it.

http://www.itwriting.com/blog/?m=20061215

Personally I think professional programmers should first learn how computers really operate underneath the hood, as a good basis. People who drive cars don't necessarily know why exactly their car broke down, but a professional mechanic had better know.

There are developers who may know everything about the capabilities of the UT3 engine, but who still have very little in-depth knowledge of the underlying hardware.

Don't get me wrong, I'm all for easy-to-use middleware and simplified development environments for hobby developers, like Blitz Basic (even the hugely successful Worms series was originally created with it by an Amiga bedroom coder), AMOS or XNA.
 

MikeB

Banned
archangelmorph said:
"Blah blah blah...
... XNA-raised entry-levels won't ever be competent enough to work at a lower level ...
blah blah blah..."

What a silly argument..

Just noticed you created a 2D XNA puzzle game. The comments above weren't meant to be degrading or anything. There are quite a few BlitzBasic and AMOS games (usually freeware or shareware) that I liked on the Amiga as well. However, those weren't the most demanding games.

Don't you agree that XNA doesn't allow you to get the maximum out of the 360 (you're also dependent on the tools the low-level XNA devs provide you with), and that you would run into trouble if you wanted to port your game to the Amiga, Macintosh or PS3 (or embedded platforms like mobile phones)?
 

Schrade

Member
MikeB said:
On the Amiga I used a very useful utility called SnoopDOS, which registers all file activity, as well as which tooltypes a program is checking, which fonts, libraries and devices are being loaded, and so on. I tried an equivalent utility for Windows inspired by SnoopDOS and it was pretty much useless to me: unbelievably much activity even when the system seems completely idle, files moving back and forth all the time for no apparent reason, and I couldn't find out what most of the files were used for anyhow. So I removed the utility; looking at all the seemingly useless activity only serves to give the user a headache.
SnoopDOS is the most amazing app ever. I LOVED that thing. There are some tools for Windows that are somewhat similar but nowhere near as easy to use, configurable or capable as SnoopDOS.

Also, with regard to the AmigaOS capabilities: it's amazing how we had such an advanced operating system back in 1985 and yet still today there are so many things not present in the "modern" OSes.

I hope the Amiga-like OS (MorphOS?) becomes commonplace someday.
 

MikeB

Banned
Schrade said:
SnoopDOS is the most amazing app ever. I LOVED that thing. There are some tools for Windows that are somewhat similar but nowhere near as easy to use, configurable or capable as SnoopDOS.

Also, with regard to the AmigaOS capabilities: it's amazing how we had such an advanced operating system back in 1985 and yet still today there are so many things not present in the "modern" OSes.

I hope the Amiga-like OS (MorphOS?) becomes commonplace someday.

AmigaOS saw much development neglect for many years after Commodore went bankrupt due to huge losses in its PC branch. This happened at a time when the Amiga held the vast majority of the CD market in Europe (a bigger market share than CDi, PC CD-ROM and SegaCD combined), and while there was a big inventory of unsold goods in Asia intended for the US launch of a new game console, a launch which got blocked by a US court over a dubious patent issue (patent fees were demanded while Commodore was running out of money; selling the huge unsold stock would have allowed them to pay those fees).

Due to the Amiga's huge popularity at the time, lengthy battles followed to acquire this IP. It took very long, but eventually Escom (which went bankrupt soon after) and then Gateway bought it. Gateway was serious about developing new Amiga systems, but Microsoft blocked this by demanding more money for Windows licenses if Gateway introduced a rival system onto the market, and considering Gateway was selling millions of PCs a year, the project was not viable for them and had to be cancelled. The Amiga IP was again sold, this time to a small company which included a few ex-Gateway employees.

This company became Amiga Inc, which was mainly developing new cross-platform embedded technologies but did not have the resources to continue a desktop-targeted platform. That work was thus outsourced to Hyperion Entertainment, which hired some very talented developers and, together with Eyetech, introduced the AmigaOne developer systems and recently released an end-user version for upgraded classic systems.

Both Amiga Inc and Hyperion want to port AmigaOS4 to the PS3 and develop new computers, which could make for a very enjoyable hobby/enthusiast operating system in the short term, with the PS3 providing a cheap, widely available platform for interested users. Sadly, they are currently fighting a huge legal battle over the IP rights, which prevents this port from happening (there's an unofficial, incomplete Mac mini port floating around, though).

MorphOS is very similar in structure and design to AmigaOS (even compatible with 3.x through CPU emulation); under the hood, however, it's IMO just an AmigaOS 3.x re-implementation with a lot of new high-level stuff running on top. Disputes and infighting caused the project to become a purely spare-time hobby project.

AROS is a rather similar open-source variant, but IMO far less complete than MorphOS. Again, it's AmigaOS 3.x-level tech under the hood (basically 1992 tech), with improved high-level replacement components also available for AmigaOS 3.x. Again, a non-commercial hobby project.

IMO, to gain any form of attention, the power of the Amiga brand is needed, as well as full-time commercial development like AmigaOS4.0 received over the last couple of years. So let's hope the court battle is over soon enough.
 

belvedere

Junior Butler
Insignificant but interesting comment from Terminal Reality.

http://www.n4g.com/ps3/NewsCom-140033.aspx?CT=2&Page=1&Page2=1#C1035730

Speaking to videogaming247 at the Sierra Spring Break 08 in Mallorca last week, Terminal Reality president Mark Randel admitted that Ghostbusters on PS3 has been held back by the fact that it will also release on 360, saying that the game would have double the amount of objects on screen if it had been PS3-only.
 

belvedere

Junior Butler
carlosp said:
this is nothing too surprising. If you optimize your code for the SPEs you can easily do a lot more than what you can do with the 360. We have experienced great potential with the PS3 and the SPEs.


Care to elaborate?

:D
 

SRG01

Member
carlosp said:
this is nothing too surprising. If you optimize your code for the SPEs you can easily do a lot more than what you can do with the 360. We have experienced great potential with the PS3 and the SPEs.

Who are you??

???
 

carlosp

Banned
SRG01 said:
Who are you??

???

someone who works at a gaming company which is working on some PS3 projects. I hope you understand that I don't want to introduce myself and my company here :)

@Care to elaborate?:

I am sorry, but at this stage everything is too technical. I am not a dev guy myself, so I am the wrong one to explain the results. I can tell you more in a few weeks when we get the first visual results.
 