
NVIDIA Helping Sony with PS3 GPU

xexex

Banned
An interesting IGN Q&A with NVIDIA, plus IGN's speculation:

http://gear.ign.com/articles/571/571598p1.html


NVIDIA, Sony and PlayStation 3

An unexpected announcement and a few questions answered.
December 07, 2004 - Last night around midnight NVIDIA announced that it has been working with Sony on the GPU for Sony's forthcoming computer entertainment system. It doesn't take a genius to figure out that they're talking about PlayStation 3 here. Note that the PlayStation 3 name isn't actually official yet, though we'll continue to use it as it's basically guaranteed that's what Sony will call its PlayStation 2 successor.

The press release basically states that NVIDIA is and has been working on a GPU for Sony's system for some time now, and unfortunately not a whole hell of a lot more than that. This sort of came from left field as it sounded like Sony's Cell processor was going to be doing all of the work in its upcoming console, including the graphical duties. Still though, it would have seemed a bit odd for NVIDIA to leave ATI as the sole third-party GPU supplier for the upcoming console generation, as ATI is (unofficially and yet officially) supplying the GPU for the successor to Xbox and Nintendo's next as well.

While this does bring about a bit of confusion, there have also likely been a few sighs of relief around the industry. The PS2's obvious weak points are low texture memory and poor image quality. Partnering with NVIDIA practically guarantees the second won't be a problem. The first is obviously still up in the air, but we don't expect Sony to drop the ball on that one again.

We had a chance to fire off a handful of questions to David Roman over at NVIDIA about the partnership. Obviously he wasn't able to answer some of our more probing questions, but a few interesting things were mentioned:



Firstly, how long has NVIDIA been working with Sony on this collaboration?

David Roman: NVIDIA has been working on aspects of Sony Computer Entertainment Inc.'s next generation system for the past 2 years.

How did NVIDIA become involved with working with Sony? Which party approached whom?

David Roman: Difficult to say. We have been talking to Sony about very many different projects from the early days of starting NVIDIA.

The Xbox GPU is essentially a beefy version of the GeForce 3. Will the PS3 (or whatever it will be called) GPU be based on forthcoming desktop GPU architecture or will it be its own entity entirely?

David Roman: It is a custom version of our next generation GPU.


Up to this point, due to its proclaimed power, we had assumed that the Cell processor would be doing all of the processing in the PS3, both the generalized (AI, physics, etc.) and all of the graphical work. Will NVIDIA's GPU work in the terms that we're currently used to and handle the graphics entirely, or will it work with Cell in ways that current GPUs don't and let Cell handle some of the work (say, vertex transformation, for example)?

David Roman: I don't have that information. I know that this is a custom chip and we are working on its development with Sony Computer Entertainment.

Will NVIDIA's GPU work be tied into the Cell architecture, or will it be a separate chip in the PS3?

David Roman: It will be a separate chip.

S-Mart or Quickie Mart?

David Roman: …



Unfortunately our questions regarding such things as image quality, HD support and backwards compatibility couldn't be commented on at this time, but we sort of assumed that going in. At any rate, a few answers bring up a few interesting points, which only lead to more interesting questions.

Firstly, if NVIDIA's GPU for PS3 is a custom version of its next-gen chip, then SLI (or dual chip) is practically a shoo-in. We'd put a few bucks on the line that says there will be at least two NVIDIA-branded GPUs in PlayStation 3. The interesting thing would be to see more than two, but that's for future investigation.
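
For the unfamiliar, split-frame rendering is one common way SLI setups divide the work: each chip takes a band of the frame. The toy sketch below shows the idea; the frame height, chip count and even split are purely our own assumptions for illustration, not anything Sony or NVIDIA have confirmed.

// Hypothetical sketch of split-frame rendering (SFR): each GPU renders
// a horizontal band of the frame. All numbers here are placeholders.
#include <cstdio>

int main() {
    const int frame_height = 480;  // assumed SD-era frame height
    const int gpu_count    = 2;    // the "at least two GPUs" scenario

    // Assign each GPU a contiguous range of scanlines.
    for (int gpu = 0; gpu < gpu_count; ++gpu) {
        int first = gpu * frame_height / gpu_count;
        int last  = (gpu + 1) * frame_height / gpu_count - 1;
        std::printf("GPU %d renders scanlines %d-%d\n", gpu, first, last);
    }
    return 0;
}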

What we're not sure what to make of is how the Cell processor and the NVIDIA-based GPU will work together. It's been said that the Cell processor is to be unbelievably fast, and if so it may outpace NVIDIA's GPU by quite a bit, meaning the GPU will be the bottleneck in the system. Alternatively, NVIDIA's next GPU will be blazing fast, faster than we'd expect, which should make console and PC gamers alike giddy with excitement.

A third option would be that Sony's Cell processor does indeed work with NVIDIA's GPU in a different fashion than regular CPUs tend to work with graphics solutions today. What way that could possibly be, we don't know. If we had to speculate, we'd say it would be something like performing vertex transformation in Cell and the lighting and shading via NVIDIA's part. Again, we can only wait and see what the truth is.
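
If we had to sketch that hypothetical split in code, it might look something like the toy example below, with the CPU side doing the vertex transforms and handing the results off for lighting and shading. Every type and function name here is our own invention for illustration, not anything either company has described.

// Toy sketch: "Cell" transforms vertices, the "GPU" gets the results.
#include <cstdio>
#include <vector>

struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; };

// 4x4 matrix * vector: the heart of vertex transformation.
Vec4 transform(const Mat4& t, const Vec4& v) {
    Vec4 r;
    r.x = t.m[0][0]*v.x + t.m[0][1]*v.y + t.m[0][2]*v.z + t.m[0][3]*v.w;
    r.y = t.m[1][0]*v.x + t.m[1][1]*v.y + t.m[1][2]*v.z + t.m[1][3]*v.w;
    r.z = t.m[2][0]*v.x + t.m[2][1]*v.y + t.m[2][2]*v.z + t.m[2][3]*v.w;
    r.w = t.m[3][0]*v.x + t.m[3][1]*v.y + t.m[3][2]*v.z + t.m[3][3]*v.w;
    return r;
}

// Stand-in for handing vertices over for lighting/shading; a real
// console would go through its own graphics API here.
void submit_to_gpu(const std::vector<Vec4>& verts) {
    std::printf("submitted %zu transformed vertices to the GPU\n", verts.size());
}

int main() {
    Mat4 identity = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
    std::vector<Vec4> in = {{1,2,3,1}, {4,5,6,1}};

    std::vector<Vec4> out;
    for (const Vec4& v : in)
        out.push_back(transform(identity, v));  // the "Cell" side of the work
    submit_to_gpu(out);                         // the "GPU" side takes over here
    return 0;
}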

We should find out the exact details behind all of this by E3 next year, or possibly as early as GDC or so if we're lucky. Either way, we're hoping that the hype behind Cell actually holds some weight and that NVIDIA can couple a graphics solution to it that really provides for a next-generation gaming experience.
 

xexex

Banned
Well, that's been their mantra since late 1999, soundwave: 1000x the performance of PS2 and 'realtime CG graphics', whatever their definition of CG graphics in realtime is....
 

Phoenix

Member
soundwave05 said:
Hardware wise I think Sony has something absolutely earth shattering cooking with PS3.

They have something "ambitious and powerful" in the design of Cell. Cell at its core is really solving the problems of scalability that all the other players such as AMD and Intel are dealing with now. Everyone is going multicore on a chip. Cell, however, looks to solve some of the complications in efficiency that go along with that.

However, it is important to state that the performance notes for Cell don't necessarily correspond to the performance characteristics of the PS3. Why? Because the performance of a Cell-based machine depends on how it's configured (how many processors, cores, etc.). If you build the theoretical 64-way Cell server, you have teraflops of processing power (which is where some of these numbers come from). However, it's easy to see that the PS3 won't have that many processors in it. While we now know some very useful information regarding the potential of this architecture, it's still too early to know for sure how powerful the PS3 itself will be.
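
To put rough numbers on that scaling argument, here's a toy calculation. Every figure in it is a made-up placeholder, not a real Cell spec; the point is only how fast the totals grow with configuration.

// Back-of-the-envelope arithmetic: the headline "teraflops" come from
// the big configurations, not from a single console chip. All figures
// below are hypothetical placeholders.
#include <cstdio>

int main() {
    const double clock_hz        = 4.0e9;  // assumed 4 GHz clock
    const double flops_per_cycle = 8.0;    // assumed FLOPs/cycle per core
    const int    cores_per_chip  = 8;      // assumed core count

    double chip_peak = clock_hz * flops_per_cycle * cores_per_chip;
    std::printf("one chip:      %6.2f TFLOPS\n", chip_peak / 1e12);
    std::printf("64-way server: %6.2f TFLOPS\n", 64.0 * chip_peak / 1e12);
    return 0;
}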

A similar case has been used by fanboys relating it to the EE, and they are somewhat correct. The EE/VUs etc. in workstation configurations far surpass the capabilities of what's in the PS2. So while the architecture was incredibly innovative in a number of ways, not all of that transferred to the PS2. Similarly, I don't think that everything the Cell-based Power processors are capable of will transfer to the PS3. The one that stands out the most is networking with 'nearby cell processors' for distributed computing. That's very valuable to IBM and their grid farms being developed by their friends at Butterfly.net; however, it is extremely unlikely that the Cell processors in your PS3 are going to be reaching out to your neighbor's PS3 or any TVs and toasters in your house to solve distributed computing problems.


Anyways, back to our regularly scheduled program.
 
I think also Sony learned a lot from the PS2.

I mean, that was really a big jump to get that type of performance into a box for $300, and I think it came off a little rough around the edges for a while.

PS3 I suspect will be more refined.
 
kaching said:
CrimsonSkies just gets really cranky when people start talking about next gen hardware that hasn't been released yet, esp. if it involves suggesting the PS3 might be more powerful than the Xbox2.

I can only imagine what he will do if the Rev ends up more powerful than the Xbox 2.
 

mrklaw

MrArseFace
If you only knew how programmers would feel coding for a barebones GPU like the PS2 GS.

I know how programmers feel about that. Unfortunately (depending on your position), it generally doesn't matter. The publishers stump up the money, so they decide the platforms. Userbase is a big part of that decision. The developer then has to work out how best to deliver. You turn round and say 'but our programmers don't like the PS2, we get our fingers dirty', and EA takes its money elsewhere. Simple, maybe sad, but true.
 
V

Vennt

Unconfirmed Member
DCharlie said:
what were the GPU specs of the two recently canned boards?

Not sure if any details of the NV50 were released, but the NV48 was just a 20-30MHz refresh of the current NV40 / 6800 chipset.
 
Spike said:
You really have no idea what you're talking about, do you?
Never said I did, either.
With the exception of the Xbox being able to handle 8 hardware lights as opposed to the Cube's 6, there really is no difference other than the amount of memory in the units.

OK, fair enough, though I don't think that's really true, because if it were, we'd see more than we have in the way of bump-mapping, normal-mapping, etc. on the system via the games at this point. Don't you think?

And ATI has been working with Nintendo on the Revolution even before Microsoft announced they had partnered with ATI. I don't expect there to be much difference between the systems graphically. I just wish these companies would start showing something already. I want to get a taste of the next gen already. :)

Yeah, but just because something's been worked on longer doesn't necessarily mean that it will make a difference in the final output, compared to others. And, yes, I can't wait to see real next-gen stuff.
 
NV50 had little leaked about it, but from what everyone knows it was going to have an innovative (or even revolutionary) approach to handling vertex/pixel shaders (READ: unified shaders). I thought it was canned because Longhorn (i.e. DX Next) was delayed.

Now, though, it looks like NV may be doing something more complicated..
 

Pug

Member
Hedgehog, although I have no idea about technical stuff, with all this bluster about CELL and Xbox's CPU I would bet my last £ that graphically the differences between all the next-gen machines will be small, especially as, no matter how anyone paints it, they are all using "custom" PC graphics solutions.
 
I have recently been corrected on a false assumption, so let me tell you what I learned on B3D:

R500 was originally planned as R400, but eventually got scrapped and pushed back. Then ATI began working for MS, renamed the project R500, and altered their roadmap. So currently we have R420 based on R300; next we'll have R520 based on R420 with some R500 enhancements; and finally R600 for the PC will be what can compare to the R500 in the XBOX. And yes, the R500 in the XBOX should be pretty much similar in capability to what NV50 was supposed to be. Assuming nVidia doesn't make any huge jumps forward in the next 6 months, I'd wager XBOX graphics will be on par with PS3, and it will also have the first-mover advantage.
 
Pug, I agree 100%...well except in Rev's case...as there is no real info to go on and Ninty's said that they don't want to compete with Sony and MS in the spec war. Perhaps, late in the generation, though, we will see some major differences.

Thanks for the info, tahrikmili. ATI's numbering scheme is confusing.
 

McFly

Member
The PS3 NV50 (or whatever you want to call it) will not have much to do with the previously planned NV50, IMHO:

"It will be a custom GPU, not based on an existing architecture," said Jen-Hsun Huang, speaking at an "Editor's Day" for reporters and analysts at the company's Santa Clara, Calif.-based headquarters. "It will be based on an architecture under development." Neither Nvidia nor Sony disclosed further details about the new architecture.

The custom graphics processor unit (GPU) will merge Nvidia's next-generation GeForce technology with SCEI's system solutions.

He specifically talked about doing away with texture mapping and spoke of more "physics" based modeling.

Fredi
 

gofreak

GAF's Bob Woodward
tahrikmili said:
I have recently been corrected on a false assumption, so let me tell you what I learned on B3D:

R500 was originally planned as R400, but eventually got scrapped and pushed back. Then ATI began working for MS, renamed the project R500, and altered their roadmap. So currently we have R420 based on R300; next we'll have R520 based on R420 with some R500 enhancements; and finally R600 for the PC will be what can compare to the R500 in the XBOX. And yes, the R500 in the XBOX should be pretty much similar in capability to what NV50 was supposed to be. Assuming nVidia doesn't make any huge jumps forward in the next 6 months, I'd wager XBOX graphics will be on par with PS3, and it will also have the first-mover advantage.

I'm not very familiar with these roadmaps, so I could well be wrong, but I thought when we heard about the R500 taping out, it was said they would be introducing the R520 in Q1 of next year, a card "based around" the tech in Xbox2?

Also, assuming nVidia won't make any jumps forward in the next 6 months is possibly dangerous. 6 months is a long time in PC graphics cards, and they'll have the advantage of knowing what ATi have been brewing for Xbox2. They will make advances; I guess the question is how big they'll be. I'm guessing they could be big if Nvidia is counting on this as their next breakthrough architecture.
 
Also, not to mention the CELL processors, which I still think are going to be the heart of the machine. You don't invest that kind of capital just for a ho-hum CPU. I think the Nvidia GPU will be the "finesse" side of things while the CELL processors will provide an incredible level of brute-force horsepower.

Maybe sort of an interesting balance on Sony's part.
 

Pug

Member
Gofreak, I agree there of course will be differences, but what you've always got to remember is that this is PR bluster. I hope Nvidia have a massive step forward lined up, but being honest it will likely be an evolution of existing technology, wouldn't you say?
 

gofreak

GAF's Bob Woodward
Pug said:
Gofreak, I agree there of course will be differences, but what you've always got to remember is that this is PR bluster. I hope Nvidia have a massive step forward lined up, but being honest it will likely be an evolution of existing technology, wouldn't you say?

What existing technology? The 6xxx?

Jen Huang has said it won't be based on any existing architecture.

Both the R5xx going into Xbox2 and this PS3 GPU appear to be based on new architectures. However differences may emerge in how new or different those architectures actually are compared to what's come before, and from the potential gains NVidia may make in their extra 6-8 months. Not to mention the gains made possible by any potential refinements in the manufacturing process made in that time (i.e. higher clockspeeds etc.).
 

Pug

Member
Gofreak, technically I have no idea about the subject of graphics cards or CPUs. But having worked in PR for 15 years now, I can pretty much tell when the "normal" is being pushed as the "extraordinary". All this latest bluster seems like that to me.
 

gofreak

GAF's Bob Woodward
Pug said:
Gofreak, technically I have no idea about the subject of graphics cards or CPUs. But having worked in PR for 15 years now, I can pretty much tell when the "normal" is being pushed as the "extraordinary". All this latest bluster seems like that to me.

I'd be as wary of NVidia's PR as the next, but it's been said a few times now that this will be based on a new architecture, from the head honcho down, and it makes sense given that graphics card cycles would place the introduction of very new types of cards in the 2005/2006 timeframe. Things cycle from "breakthrough" to "refreshes" and NVidia will have been refreshing the NV30 architecture for a long time if they didn't have something new by 2006!

The R5xx marks ATi's departure from the R3xx architecture, and NVidia's NV5x will mark a similar departure.
 

rastex

Banned
Most new lines of cards are based on new architectures. Tahrik is right about the whole internal renumbering of chips; both ATI and nVidia do it constantly, much to the ire of their employees ;) R500 is a brand-new architecture in the way the R300 was new as well.

So all this "new architecture" hoopla y'all are proclaiming as light from the gods is just straight-up routine and PR-speak. Keep your heads together, people.
 
MightyHedgehog said:
Pug, I agree 100%...well except in Rev's case...as there is no real info to go on and Ninty's said that they don't want to compete with Sony and MS in the spec war. Perhaps, late in the generation, though, we will see some major differences.

Thanks for the info, tahrikmili. ATI's numbering scheme is confusing.


Don't underestimate ArtX and Nintendo just because Iwata says something. They should have the specs to be up there.
 

Pimpwerx

Member
Kleegamefan said:
Slightly OT:here is my Dolby Digital Plus thread:

http://forums.gaming-age.com/showthread.php?t=26767

Being a Blu-ray ROM product, the PS3 will also be using Dolby Digital Plus and will have an internal DD+ encoder (this was confirmed by my contact @ Dolby Labs)

Can you say 54 discrete sound channels???
I'm a total rube when it comes to audio tech, but clarify this for me: 54 discrete channels, would that be like the 5.1 systems with 5 channels and the subwoofer channel? Or are they talking about stuff like "voices" in the game? A channel count treats the left and right outputs each as a separate channel, right? PEACE.
 
CrimsonSkies just gets really cranky when people start talking about next gen hardware that hasn't been released yet, esp. if it involves suggesting the PS3 might be more powerful than the Xbox2.

It's only fitting, as most Xbots are disenfranchised Sega zealots.

Apparently Sega fans are the lemmings of the new age, destined to wander off the edge of a cliff no matter how many opportunities they're given to redeem themselves.

All is well. This just further supports Sea Manky's patented Bitter Sega Cunt Theory. :p
 

marsomega

Member
Freeburn said:
I think that is what has a lot of us thinking in the same direction.

Canning the NV48, due to it being a '20MHz' refresh, was understandable; no-one has really put forward a similar justification for the NV50 being canned. Most sites that reported this seemed to add a "wtf are they up to?" or a "they must have something else, but what???" element to the end of such articles, although there were a few that put forward the notion that the whole ATI/WGF relationship has affected Nvidia's roadmap.

Interesting to speculate, but ultimately it's still going to be a while before anything is really known about what they are up to.


Freeburn.

The ones dictating the next API standard (unified shader model, WGF, etc.) are none other than Microsoft AND ATI. Nvidia doesn't want to support a unified shader model in hardware, but rather abstractly. They tried to persuade MS to stay away from the technology, or to allow them not to implement the new API directly (hardware-wise); instead they would support the new spec abstractly. In other words, it would support the new unified shader model features, but under the hood it would be VSs and PSs doing the work. NV50 most definitely was not designed with the unified shader model in mind.
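
Just to illustrate what I mean by supporting the spec abstractly, here's a purely hypothetical sketch; none of this is based on actual NVIDIA hardware or driver code, it's only the shape of the idea: a unified front end that still routes work to separate vertex and pixel units underneath.

// Hypothetical: one unified shader interface on top, split units below.
#include <cstdio>

enum class Stage { Vertex, Pixel };

// Unified front end: callers see a single entry point for all shader work...
void run_unified_shader(Stage stage) {
    // ...but the "hardware" underneath is still split into dedicated units.
    if (stage == Stage::Vertex)
        std::printf("routed to dedicated vertex shader unit\n");
    else
        std::printf("routed to dedicated pixel shader unit\n");
}

int main() {
    run_unified_shader(Stage::Vertex);
    run_unified_shader(Stage::Pixel);
    return 0;
}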

Now, why would Microsoft not allow NVIDIA to have their own implementation?

Simple: it's not only Microsoft controlling this new standard, it's also ATI.

Whoever controls the standard receives the royalties. NVIDIA certainly doesn't want to support the implementation at the hardware level, since that means they will pay royalties to Microsoft and ATI. You don't cancel something like that, for which you've invested millions of dollars.

Also, think to yourself: do you really think NVIDIA likes to pay royalties to ATI for every Longhorn-ready/compatible card they sell? ATI likes it though. :)

It's more likely that if the NV50 was canned, an NV5X will resurface with whatever was salvaged from the original. There's tons of tech in a GPU, too much to throw away.

mashoutposse said:
Why would NVidia reveal something like that while they are currently embroiled in a hard fought GPU war and are looking for every advantage?

If the speculation in this thread is on track, I'm sure that NVidia would love ATi to continue along its path of development until it is too late.

:lol You are kidding right?

No they won't. ATI is the one up there with Microsoft dictating the new standard to a spec they've wanted since the R400's days on the drawing table. They are also the ones (along with Microsoft) dictating what video card makers must do to comply with Longhorn. And I'm pretty sure the reason why NVIDIA would be denied and forced to implement something they have to pay royalties for has something to do with ATI.


Keep in mind that it doesn't matter what NVIDIA does with CELL for their next GPU. If ATI is not happy with it, most likely Microsoft won't be happy with it. Throw out Microsoft, and kiss the PC market goodbye.
 
V

Vennt

Unconfirmed Member
Thanks marsomega, that puts events into a far clearer focus; didn't quite realise the MS/ATI relationship wrt WGF was like that.
 
marsomega said:
Freeburn.

The ones dictating the next API standard (unified shader model, WGF, etc.) are none other than Microsoft AND ATI. Nvidia doesn't want to support a unified shader model in hardware, but rather abstractly. They tried to persuade MS to stay away from the technology, or to allow them not to implement the new API directly (hardware-wise); instead they would support the new spec abstractly. In other words, it would support the new unified shader model features, but under the hood it would be VSs and PSs doing the work. NV50 most definitely was not designed with the unified shader model in mind.

Now, why would Microsoft not allow NVIDIA to have their own implementation?

Simple: it's not only Microsoft controlling this new standard, it's also ATI.

Whoever controls the standard receives the royalties. NVIDIA certainly doesn't want to support the implementation at the hardware level, since that means they will pay royalties to Microsoft and ATI. You don't cancel something like that, for which you've invested millions of dollars.

Also, think to yourself: do you really think NVIDIA likes to pay royalties to ATI for every Longhorn-ready/compatible card they sell? ATI likes it though. :)

It's more likely that if the NV50 was canned, an NV5X will resurface with whatever was salvaged from the original. There's tons of tech in a GPU, too much to throw away.

I may be totally wrong, but I find the first part of this explanation quite disagreeable. None of the DirectX standards as they are require the IHVs to pay royalties to anyone for having pixel/vertex shaders in their cards, and ALUs capable of threads of both vertex and pixel shader operations should not require either of the two to pay the other, as long as they R&D their own solutions themselves.

Moreover, ATI and MS do not control DX standards by themselves; every IHV out there (INCLUDING S3 and SGI) is represented when DX standards are established. It can't be that MS was conspiring against nVIDIA to add unified shaders into the standard 'at the last minute'. I'm pretty damn sure nVIDIA had enough time to develop their unified shader solution. Maybe they weren't satisfied with it and scrapped it, or maybe they learned something from Sony they could add to improve it. Whatever, but I think NV50 had unified shaders..
 

Izzy

Banned
Freeburn said:
Thanks marsomega, that puts events into a far clearer focus; didn't quite realise the MS/ATI relationship wrt WGF was like that.

Neither did I. In fact, I still don't. ;)
 
V

Vennt

Unconfirmed Member
:lol Me neither now, maybe I'll revisit this thread when those who know what they're talking about, know what they're talking about, and I'll have a chance of getting it :p
 

Izzy

Banned
marsomega said:
Freeburn.

The ones dictating the next API standard (unified shader model, WGF, etc.) are none other than Microsoft AND ATI. Nvidia doesn't want to support a unified shader model in hardware, but rather abstractly. They tried to persuade MS to stay away from the technology, or to allow them not to implement the new API directly (hardware-wise); instead they would support the new spec abstractly. In other words, it would support the new unified shader model features, but under the hood it would be VSs and PSs doing the work. NV50 most definitely was not designed with the unified shader model in mind.

Now, why would Microsoft not allow NVIDIA to have their own implementation?

Simple: it's not only Microsoft controlling this new standard, it's also ATI.

Whoever controls the standard receives the royalties. NVIDIA certainly doesn't want to support the implementation at the hardware level, since that means they will pay royalties to Microsoft and ATI. You don't cancel something like that, for which you've invested millions of dollars.

Also, think to yourself: do you really think NVIDIA likes to pay royalties to ATI for every Longhorn-ready/compatible card they sell? ATI likes it though. :)

It's more likely that if the NV50 was canned, an NV5X will resurface with whatever was salvaged from the original. There's tons of tech in a GPU, too much to throw away.



:lol You are kidding right?

No they won't. ATI is the one up there with Microsoft dictating the new standard to a spec they've wanted since the R400's days on the drawing table. They are also the ones (along with Microsoft) dictating what video card makers must do to comply with Longhorn. And I'm pretty sure the reason why NVIDIA would be denied and forced to implement something they have to pay royalties for has something to do with ATI.


Keep in mind that it doesn't matter what NVIDIA does with CELL for their next GPU. If ATI is not happy with it, most likely Microsoft won't be happy with it. Throw out Microsoft, and kiss the PC market goodbye.

Not that I don't believe you that Nvidia is paying ATI royalties for 'Longhorn ready' cards, but I need further empirical evidence. So, proof?
 
Funny how everyone on this board is so excited by Nvidia signing to make a GPU for the PS3.... well, I guess I am as well, since the whole EE was laughable and not so easy to work with, huh. At least with the Nvidia GPU we will have an easier-to-develop-for PS3 :)

Emotion Engine... heh... if Xbox had shipped earlier it would have beaten PS2 like a drum. Sony really does need Nvidia to compete next gen... and they sure know it :)
 
V

Vennt

Unconfirmed Member
Random_Hajile said:
Funny how everyone on this board is so excited by Nvidia signing to make a GPU for the PS3.... well, I guess I am as well, since the whole EE was laughable and not so easy to work with, huh. At least with the Nvidia GPU we will have an easier-to-develop-for PS3 :)

Emotion Engine... heh... if Xbox had shipped earlier it would have beaten PS2 like a drum. Sony really does need Nvidia to compete next gen... and they sure know it :)

You are stupid, categorically & uniquely stupid.

It is a long time since I've called someone that.

Had it occurred to you that maybe, juuuust maybe, a good portion of the 'excitement' was due to the fact that some information is starting to trickle out about the next gen, and not part of any "fanboy" pissing match? The fact that this thread is 400 replies long without a flamewar is testament to that, and idiots like you just want to piss over it.

Why do I think you may be breaking established COPPA guidelines by posting here?
 

marsomega

Member
tahrikmili said:
I may be totally wrong, but I find the first part of this explanation quite disagreeable. None of the DirectX standards as they are require the IHVs to pay royalties to anyone for having pixel/vertex shaders in their cards, and ALUs capable of threads of both vertex and pixel shader operations should not require either of the two to pay the other, as long as they R&D their own solutions themselves. Moreover, ATI and MS do not control DX standards by themselves; every IHV out there (INCLUDING S3 and SGI) is represented when DX standards are established. It can't be that MS was conspiring against nVIDIA to add unified shaders into the standard 'at the last minute'. I'm pretty damn sure nVIDIA had enough time to develop their unified shader solution. Maybe they weren't satisfied with it and scrapped it, or maybe they learned something from Sony they could add to improve it. Whatever, but I think NV50 had unified shaders..

Nope. But read around. NVIDIA is on record stating they are completely not interested in the technology. It's pretty well known that ATI and NVIDIA have opposing views on the technology. I was just discussing this the other day with Phoenix. Looking around, I didn't save the link, but I'm sure some threads at Beyond3D still cover the subject matter. I'm pretty damn sure about that.

First, this is not DirectX.

Second, unlike DirectX, the unified shader technology is based off of ATI's technology and development. (Their input specifically will heavily reflect ATI's work.) They will receive royalties just like Intel receives royalties for CPUs based on the x86 architecture. Why the hell do you think Intel doesn't want to move from this 20+ year-old technology? Are you pretty damn sure Intel doesn't mind not receiving a dime for an architecture they designed and introduced to the market? ATI is not giving the research and technology they are contributing to the unified shader spec away for free.
 

xexex

Banned
Originally Posted by Random_Hajile:
Funny how everyone on this board is so excited by Nvidia signing to make a GPU for the PS3.... well, I guess I am as well, since the whole EE was laughable and not so easy to work with, huh. At least with the Nvidia GPU we will have an easier-to-develop-for PS3

Emotion Engine... heh... if Xbox had shipped earlier it would have beaten PS2 like a drum. Sony really does need Nvidia to compete next gen... and they sure know it


Originally Posted by Freeburn:
You are stupid, categorically & uniquely stupid.

It is a long time since I've called someone that.

Had it occurred to you that maybe, juuuust maybe, a good portion of the 'excitement' was due to the fact that some information is starting to trickle out about the next gen, and not part of any "fanboy" pissing match? The fact that this thread is 400 replies long without a flamewar is testament to that, and idiots like you just want to piss over it.

Why do I think you may be breaking established COPPA guidelines by posting here?

Yes, he is indeed stupid. He's posted TWO posts about ATI doing the graphics for Xbox 2 and Nintendo's console. Stuff that is months and years old.
 