
GDC '08: LucasArts prioritizing PS3, Unleashing Force

proposition said:
The R600 and Xenos share the same fundamental architectural paradigm.

Can you elaborate? Or were you referring to "unified" shaders? I was under the impression the R500 was a wholly unique design, not closely related to the R520, R580, or R600.
 
MoNaRky said:
Even the Xbox 360 was not able to take full advantage of the unified shader architecture until DX10 was finalized, and that is why Gears was its first great-looking game, and why they could then display 1080p. But Xenos still does not support full DX10 features, because Microsoft had only just begun DX10 development prior to the GPU being finished.

Was that really directly related to DX10, though? I was under the impression that the root problem was that the load balancing/scheduling mechanism the 360 used for shader allocation kind of sucked. It was leaving plenty of idle time, killing efficiency.

I would hope they were reworking that mechanism in the background regardless of DX10's then-current state.
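That scheduling argument is the whole case for unified shaders, actually. A toy model of it (purely illustrative numbers, nothing to do with the real unit counts in Xenos or RSX):

```python
# Toy model of why a fixed vertex/pixel shader split wastes cycles:
# the frame ends only when BOTH pools finish, so a frame that is heavy
# on one workload leaves the other pool idling. A unified pool with
# ideal dynamic load balancing keeps every unit busy.
# All numbers are illustrative, not real hardware figures.

def idle_time_fixed(vertex_work, pixel_work, v_units, p_units):
    """Unit-cycles wasted when units are statically partitioned."""
    v_cycles = vertex_work / v_units
    p_cycles = pixel_work / p_units
    frame = max(v_cycles, p_cycles)         # frame ends when both finish
    capacity = frame * (v_units + p_units)  # total unit-cycles available
    return capacity - (vertex_work + pixel_work)

def idle_time_unified(vertex_work, pixel_work, units):
    """With perfect dynamic balancing over a unified pool, nothing idles."""
    return 0.0

# A pixel-heavy frame on a fixed 4+12 split vs a 16-unit unified pool:
print(idle_time_fixed(100, 1100, 4, 12))   # > 0: vertex units sit idle
print(idle_time_unified(100, 1100, 16))    # 0.0
```

The fixed split only breaks even when the frame's workload happens to match the partition; any imbalance shows up as idle hardware.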

But yes, MGS4 and Killzone II are the only titles involved in beta testing MT Evans' DX10 features. With Killzone II it should be obvious that RSX is DX10-capable. People should have realized that when they announced they were doing deferred rendering in combination with MSAA for the demo last summer. Only PC GPUs like the G80 series cards are capable of doing that, and the capability to do it is required by Microsoft's own DX10 spec. Xenos is not capable of running this game or meeting that spec.

As has been stated, there has always been a disconnect between the hardware capabilities of GPUs and the current DX model. To be compliant, a card has to support the set of features prescribed by DX, but that doesn't mean it can't support a superset.

Just because RSX supports some features beyond DX9 does not imply it is fully DX10 compatible at all.
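That compliance-versus-superset point boils down to a simple set relationship. Here's a toy sketch of it (the feature names and groupings are made up for illustration, not real DirectX caps bits):

```python
# Toy model of API compliance: a card is compliant with an API level
# if its supported features are a SUPERSET of that level's required set.
# Supporting extras beyond one level does not make it compliant with
# the next. Feature names below are illustrative, not real DX caps.

DX9_REQUIRED = {"ps_3_0", "vs_3_0", "float_textures"}
DX10_REQUIRED = {"ps_4_0", "vs_4_0", "gs_4_0",
                 "unified_shaders", "texture_arrays"}

def compliant(card_features, required):
    """True if the card supports every feature the level requires."""
    return required <= card_features  # subset test on sets

# A hypothetical card: everything DX9 needs, plus a few extras.
rsx_like = DX9_REQUIRED | {"texture_arrays", "vendor_extension_x"}

print(compliant(rsx_like, DX9_REQUIRED))   # True: meets DX9
print(compliant(rsx_like, DX10_REQUIRED))  # False: extras beyond DX9
                                           # don't add up to DX10
```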
 
MoNaRky said:
But yes, the Unified Shader Model is in RSX, and it has many secrets that are going to be revealed soon. Notice Sony never said RSX was less than its near-2TFLOPS performance by itself; it was everyone else insisting it was a G70. Sony never said it was or wasn't. How do you think they got that number, though? Made it up? No, and near 2TFLOPS for RSX alone isn't NVFLOPS either!

Even the Xbox 360 was not able to take full advantage of the unified shader architecture until DX10 was finalized, and that is why Gears was its first great-looking game, and why they could then display 1080p. But Xenos still does not support full DX10 features, because Microsoft had only just begun DX10 development prior to the GPU being finished.

But yes, MGS4 and Killzone II are the only titles involved in beta testing MT Evans' DX10 features. With Killzone II it should be obvious that RSX is DX10-capable. People should have realized that when they announced they were doing deferred rendering in combination with MSAA for the demo last summer. Only PC GPUs like the G80 series cards are capable of doing that, and the capability to do it is required by Microsoft's own DX10 spec. Xenos is not capable of running this game or meeting that spec.

Results? MGS4 and Killzone will set the standard for games on PS3 to follow in years to come!

Now...... for you clowns who flunked out of third grade, with the mentality of the head-stuck-up-his-*** picture posted recently: just because I just joined NeoGAF doesn't mean I'm some ignorant NOOB like you. Since you (in the post following) have obvious trouble "duh.... ciphering" facts and information when they're put in your face, maybe you need to go out and get yourself a real "edgimacation" so you can make more intelligent comments! :lol

Posting shit like the RSX supports the Unified Shader Model makes you an ignorant NEWB. Let me make one thing very clear to you, flunky: the RSX does not support the Unified Shader Model. Please get that through your fucking head. How do I know this? Well, if the RSX were really capable of unified shaders, then it would have been utilized by the top first-party developers long ago. Care to prove me wrong? Post your sources.
 
Truespeed said:
Posting shit like the RSX supports the Unified Shader Model makes you an ignorant NEWB. Let me make one thing very clear to you, flunky: the RSX does not support the Unified Shader Model. Please get that through your fucking head. How do I know this? Well, if the RSX were really capable of unified shaders, then it would have been utilized by the top first-party developers long ago. Care to prove me wrong? Post your sources.

Holy hostility, batman! :lol
 
Why the f**k are people talking DX10 and the PS3? I mean, it doesn't use DX. Also, isn't the 360 GPU a crippled DX10 part, as it doesn't have all the features required?

RabG.
 
RabG said:
Why the f**k are people talking DX10 and the PS3? I mean, it doesn't use DX. Also, isn't the 360 GPU a crippled DX10 part, as it doesn't have all the features required?

RabG.

You aren't writing a letter; you don't need to put your "name" at the end. Look at the left side of the screen: it shows people's usernames and avatars.
 
WrikaWrek said:
You aren't writing a letter; you don't need to put your "name" at the end. Look at the left side of the screen: it shows people's usernames and avatars.

RabG Seal of Authenticity / Quality / Emphasis. I think it's cute.
 
Death Dealer said:
Can you elaborate? Or were you referring to "unified" shaders? I was under the impression the R500 was a wholly unique design, not closely related to the R520, R580, or R600.
Exactly, the unified shaders. It's like the difference between a Wankel engine and a regular piston engine. Even if two implementations of the Wankel engine aren't exactly the same, they're still not piston engines.
 
TEH-CJ said:
So about that star wars game...

I wish limb-removal would make it into the Star Wars series. Taking off a stormtrooper's legs with a lightsabre would be pretty good, or perhaps tearing them apart with Force powers. Needn't bother about the blood either, if the sabre's heat seals the wound. Although, Ninja Gaiden quantities of blood in zero-G could be alright.
 
TEH-CJ said:
So about that star wars game...

It looks pretty nice.

[six screenshots]
 
News flash: CELL IS THE FUTURE

Game console CPUs are going to be more like Cell from here on out. You can pretty much count on that.

Meanwhile, game publishers have to pour gobs of cash into engines and technologies. They hope to get some mileage out of this investment.

Now, if you have technologies that don't run so great on Cell, you're going to be way behind the curve when most or all console CPUs are like Cell. Especially next to companies like Lucas, who managed to use some forethought in the matter.
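For what it's worth, "designing for Cell" mostly means structuring engine work as lots of small, independent jobs instead of one big serial loop. A toy sketch of the idea (Python threads standing in for SPUs here; the function names are made up, and real Cell code would be C with SPU intrinsics and DMA):

```python
# Toy sketch of a Cell-style job model: instead of one serial update
# loop, the engine splits work into small, self-contained jobs that
# can be farmed out to many simple cores (the SPUs, on Cell).
# ThreadPoolExecutor is used purely for illustration.

from concurrent.futures import ThreadPoolExecutor

def drag_job(chunk):
    """Process one self-contained slice of the world state."""
    return [v * 0.99 for v in chunk]  # e.g. apply drag to velocities

def run_frame(velocities, n_workers=4, chunk_size=256):
    # Split the data into independent chunks, one per job.
    chunks = [velocities[i:i + chunk_size]
              for i in range(0, len(velocities), chunk_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(drag_job, chunks)  # jobs run concurrently
    # Reassemble the frame's results in order.
    return [v for chunk in results for v in chunk]
```

An engine built around jobs like this scales with core count; one built around a single big loop doesn't, which is the "behind the curve" point above.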
 
Salazar said:
I wish limb-removal would make it into the Star Wars series. Taking off a stormtrooper's legs with a lightsabre would be pretty good, or perhaps tearing them apart with Force powers. Needn't bother about the blood either, if the sabre's heat seals the wound. Although, Ninja Gaiden quantities of blood in zero-G could be alright.

Needless to say, dismemberment does happen in the novels :lol
 
Crayon said:
News flash: CELL IS THE FUTURE

Game console CPUs are going to be more like Cell from here on out. You can pretty much count on that.

Meanwhile, game publishers have to pour gobs of cash into engines and technologies. They hope to get some mileage out of this investment.

Now, if you have technologies that don't run so great on Cell, you're going to be way behind the curve when most or all console CPUs are like Cell. Especially next to companies like Lucas, who managed to use some forethought in the matter.

This is also true.
 
Onix said:
Work's been a bitch ... what can I say :(

Okay ... I fail. Is that better? :p
I have two friends with it, so I've seen it before ... just not on my god-like HT :D

You've redeemed yourself by playing Uncharted; that's all that matters.
 
Onix said:
Work's been a bitch ... what can I say :(

Okay ... I fail. Is that better? :p
I have two friends with it, so I've seen it before ... just not on my god-like HT :D

Yeah, the game looks great, and no, you do not fail; at least you bought it :D
 
Truespeed said:
Posting shit like the RSX supports the Unified Shader Model makes you an ignorant NEWB. Let me make one thing very clear to you, flunky: the RSX does not support the Unified Shader Model. Please get that through your fucking head. How do I know this? Well, if the RSX were really capable of unified shaders, then it would have been utilized by the top first-party developers long ago. Care to prove me wrong? Post your sources.

Why in the world would I even care what you think, fool. No, I don't need to throw any links at you, because way deep inside the perverted recesses of that mind of yours, in the abscessed deluge of ignorance, is the possible recognition that the RSX is, first and foremost, not a G70 series chip either!

Now, you've got your links ready to prove me wrong? Of course. Where are they? .....and I mean proof, not ass-umptions based on "he said, she said" written on the marbles rolling around in your head!

First off, you'll need to prove that Nvidia and Sony took a G70 with a core base clock of 400MHz and proceeded to overclock it to 550MHz (which just so happens to be closer to an 8800 GPU's core clock).

Or that they took a factory-overclocked BFGTech GeForce 7800 GT OC H2O (with the cooling solution pulled and tossed in the can), which was already clocked at the highest factory clock of 470MHz, and doubled its overclock rate! ...then expected it to live for 10 years clocked 40% over its base clock, for that G70 chip to be an RSX.

Where are your cooling solutions, man? ....in the trash, they won't fit in a PS3....LOLz Then they added even more heat production to RSX with the GDDR3 modules under the heat spreader of a 1200+ pin substrate. Which, under your people's assumptions, would be leading nowhere inside that 7800 chip's silicon?!?! ....duh

Do you know how logic works in the substrate network layers that connect it to its silicon? Go study "chip substrates" and come back when you can logically carry on a decent conversation about how everything has to connect to something in the structure of a die on silicon. LOLz

http://www.hardocp.com/article.html?art=ODU1LDEsLDA=

The highest "STABLE" (24/7) clock ever achieved on a G70 series GPU isn't over 500MHz!

Yes, a relative few using custom overclocking utilities have gotten as high as 512MHz for very limited amounts of time, but you just blew your warranty by doing that!

Have you ever seen a GPU or CPU chip fry? They explode! Not much left.....and if you are more comfortable believing that they are pumping in more than enough voltage to disintegrate the silicon of a G70 chip (which you think the RSX is), rather than it possibly being a unified shader architecture, go right ahead! ......THE PS3 W/RSX CAME OUT AFTER THE G80 LAUNCH!......not like Xenos, a year and a half before ATI finally got their heat-monster PC unified shader architecture into stores (over half a year after both the 8800 and RSX were sitting on shelves, fool)!

Nvidia and Sony started research and development on RSX and the G80 unified shader model at the same time, in 2002. Nvidia announced that design and prototyping had been completed for their unified shader model just before the Xbox 360 launched in November 2005. Then they said it wasn't completed. Then they claimed they weren't working on unified shaders for anything (they weren't needed), and then announced the design of RSX was completed in March of 2006. That same month, Kutaragi announced final dev kits would begin shipping in June 2006. So the design of the RSX took 4+ years, and ATI took just 2+ years to design and engineer Xenos (and it shows, with record RRoDs - X-clamp)???

But just to make you happy, I've changed my mind, and we'll just continue to believe ATI or M$ messed up with that fast-and-trash design that has RRoD'd 30% of the Xbots' Xboxes.

Then we PS3 owners can claim, "Hey man, check out my PS3 with the massively overclocked G70 RSX." As we continue with big fat smiles on our faces: "We all run our PS3s 24/7 at a 40% overclock and they haven't fried yet!!! ...... woo...hoo....We be ruling the 'Overclockers Club' with 10 m_i_l_l_i_o_n members having a party!" :D ....and when Killzone II arrives running full DX10 features, we can all claim we have the only DX10-compatible, OVERCLOCKED G70s in existence! :lol
 
If you really want to know what the RSX is and is capable of, without any fanboy blinkers or bullshit articles, I recommend you go to Beyond3D. There you can read posts from people working on the RSX and people who have worked on it. Really, much of what you posted is incorrect, even the clock speed, which is also something you should read up on.

 
MoNaRky said:
Why in the world would I even care what you think, fool. No, I don't need to throw any links at you, because way deep inside the perverted recesses of that mind of yours, in the abscessed deluge of ignorance, is the possible recognition that the RSX is, first and foremost, not a G70 series chip either!

Regardless of whether you care what we think or not, making claims like this without any substantiation leads to banning.

Beyond that, why would assuming RSX is indeed not a G70 mandate that it have unified shaders?

Now, you've got your links ready to prove me wrong? Of course. Where are they? .....and I mean proof, not ass-umptions based on "he said, she said" written on the marbles rolling around in your head!

Haven't devs, etc. discussed the number of pixel and vertex shaders in RSX's pipeline? Why would they lie?

THE PS3 W/RSX CAME OUT AFTER THE G80 LAUNCH!

Actually, it didn't. nVidia didn't manufacture it; Sony did ... and Sony had the design well before G80. Beyond that, the PS3 was delayed due to problems with Silicon Image's HDMI Tx and blue diode production yields. They didn't send the RSX design back to nVidia for retooling during this delay.
 
MoNaRky said:
But just to make you happy, I've changed my mind, and we'll just continue to believe ATI or M$ messed up with that fast-and-trash design that has RRoD'd 30% of the Xbots' Xboxes.

Then we PS3 owners can claim, "Hey man, check out my PS3 with the massively overclocked G70 RSX." As we continue with big fat smiles on our faces: "We all run our PS3s 24/7 at a 40% overclock and they haven't fried yet!!! ...... woo...hoo....We be ruling the 'Overclockers Club' with 10 m_i_l_l_i_o_n members having a party!" :D ....and when Killzone II arrives running full DX10 features, we can all claim we have the only DX10-compatible, OVERCLOCKED G70s in existence! :lol

[image]


OK, we get it, mods. Ahahaha, really funny!!!

Who is it? You guys are baiting us! Why are you mods so cruel to us?! Going undercover, ah! You don't fool me!
 
Pug said:
If you really want to know what the RSX is and is capable of, without any fanboy blinkers or bullshit articles, I recommend you go to Beyond3D. There you can read posts from people working on the RSX and people who have worked on it. Really, much of what you posted is incorrect, even the clock speed, which is also something you should read up on.

Apparently, he said that he comes from Beyond3D.
 
I thought this whole thing about the RSX being some super-secret Nvidia/Sony GPU had died almost a year ago. Guess people just don't let go, do they... Relax, man, the RSX is a very nice GPU, but it's not a G80 :)
 