
John Carmack on PS4 vs. Xbox One Specs: They're 'Very Close'

stb

Member
So he hasn't actually done enough benchmarking to draw a conclusion but felt compelled to drop a sound bite that sounds like he drew a conclusion. Got it.
 
Thuway is very funny sometimes. When that whole DF RAM article was the hot topic, he was copping pleas about the amount of RAM not mattering to devs and about developers saying they didn't care.

5 GB GDDR5 versus 7 GB GDDR5 will not yield the difference that some of you are thinking.

If I were to tell you that developers are NOT concerned with the RAM situation and have bigger things to worry about (polygon count, asset creation, learning efficient new methods of coding, relearning x86) would that at all quell the hive mind? If I were to tell you that you are better off bitching about OTHER things (upclocking the GPU, complaining about the relatively low performance of the CPU) would that quell the hive mind?

If you believe the narrative that courting indie developers, giving out free devkits, talking to developers, and putting in 8 GB of GDDR5 (with 5 available to developers AT LAUNCH) is not a gaming machine then you and I are not seeing eye to eye.

Parrots love Robert Downey Jr. and Iron Man, but Cyberz pls :(.

http://www.neogaf.com/forum/showpost.php?p=73171956&postcount=677

but when the Xbox One 3GB OS reserve first came out he was singing a different tune

Do people finally believe the rumors or do they think we are all just making shit up to make the Xbox look weaker? It is a media machine first and foremost, and games player second.

http://www.neogaf.com/forum/showpost.php?p=58359297&postcount=27
 
Thuway is very funny sometimes. When that whole DF RAM article was the hot topic, he was copping pleas about the amount of RAM not mattering to devs and about developers saying they didn't care.



http://www.neogaf.com/forum/showpost.php?p=73171956&postcount=677

but when the Xbox One 3GB OS reserve first came out he was singing a different tune



http://www.neogaf.com/forum/showpost.php?p=58359297&postcount=27


To be fair though, you're still talking about a difference between GDDR5 and DDR3. 5 gigs of GDDR5 is in fact a ton of RAM for gaming purposes; 5 gigs of DDR3 is not the same. Whether or not it'll become an issue, I couldn't tell you.
 

KageMaru

Member
So yeah, why buy a GeForce Titan when you can buy a GeForce 670, right..

Choosing a console is not the same as choosing a GPU. If you can afford it, of course you would get the faster GPU since you're playing the same games no matter what. However it's not the same for consoles that could have different features and exclusives.
 
To be fair though, you're still talking about a difference between GDDR5 and DDR3. 5 gigs of GDDR5 is in fact a ton of RAM for gaming purposes; 5 gigs of DDR3 is not the same. Whether or not it'll become an issue, I couldn't tell you.

He was clearly speaking to the RAM allotment and not the type. 5GB of DDR3 + eSRAM vs 5GB of GDDR5 means one is a games machine and the other one isn't? No excuses for that one.
 
Thuway is very funny sometimes. When that whole DF RAM article was the hot topic, he was copping pleas about the amount of RAM not mattering to devs and about developers saying they didn't care.



http://www.neogaf.com/forum/showpost.php?p=73171956&postcount=677

but when the Xbox One 3GB OS reserve first came out he was singing a different tune



http://www.neogaf.com/forum/showpost.php?p=58359297&postcount=27
I'm pretty sure he was talking about that in the grand scheme of things; it was right after they announced a console but focused on non-gaming features.
 

Lynn616

Member
Ruined? Hyperbole much? Has he gotten things wrong? Yes, and so have other, even more reputable sources. At one point in time, they were getting information on a 200MHz increase. That CLEARLY didn't happen.

But these people have been right several times as well.

How many times has he been right vs wrong?

the PS4 literally trounces the Xbox 1 in software development (to the point where developers were utilizing PS4 tools to develop games on x1, and there are even talks of 20-30-40fps vs 60fps), ''they're the same'' lol. I'm sure all those complaining about the X1 being a bitch to develop for are all delusional. They could be using similar architecture, but they sure don't have the same quality dev kits.

Developers are using PS4 tools to develop games on X1?

20fps vs 60fps?

All those complaining X1 is a bitch to develop for?

Can you give some links for this? I have not heard any of that.
 

Cidd

Member
All this arguing over specs is pointless. If these consoles launch and the games are identical on both, the cheaper console will win. Some are forgetting that if both consoles launch at the announced prices, the PS4 will always be the cheaper one this gen. You would be crazy to think MS is the only one who will drop their price in a year.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It was interesting to read in another thread that a PC dev was having a hard time splitting his main game loop to run across many threads/many cores. It goes to show how weak the Jaguar cores are relative to a modern Intel gaming computer.

I wonder how many other PC devs are going to have a hard time breaking up their gaming code to work successfully over multiple weak CPU cores.
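
To make that "breaking up the game loop" problem concrete, here's a minimal sketch (my own illustration, not from that thread) of the kind of data-parallel split devs get pushed towards on eight slow cores. All the names are made up, and a real engine would use a persistent job system rather than spawning threads every frame:

```cpp
// Minimal sketch: splitting one big per-frame update across several weak
// cores instead of relying on one fast core. Entity/update names are made
// up for illustration only.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f, vx = 1.0f; };

// Update a contiguous slice of entities; slices are independent, so no locks.
static void update_slice(std::vector<Entity>& ents, std::size_t begin,
                         std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        ents[i].x += ents[i].vx * dt;
}

int main() {
    std::vector<Entity> entities(100000);
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const float dt = 1.0f / 60.0f;

    for (int frame = 0; frame < 10; ++frame) {        // stand-in for the game loop
        std::vector<std::thread> workers;
        const std::size_t chunk = entities.size() / cores;
        for (unsigned c = 0; c < cores; ++c) {
            std::size_t begin = c * chunk;
            std::size_t end = (c == cores - 1) ? entities.size() : begin + chunk;
            workers.emplace_back(update_slice, std::ref(entities), begin, end, dt);
        }
        for (auto& w : workers) w.join();              // per-frame barrier
    }
    std::printf("x[0] after 10 frames: %f\n", entities[0].x);
    return 0;
}
```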
 

szaromir

Banned
It was interesting to read in another thread that a PC dev was having a hard time splitting his main game loop to run across many threads/many cores. It goes to show how weak the Jaguar cores are relative to a modern Intel gaming computer.

I wonder how many other PC devs are going to have a hard time breaking up their gaming code to work successfully over multiple weak CPU cores.
Console devs have operated in such an environment for years now. But yeah, Sony had some serious balls calling it supercharged PC architecture.

What thread BTW?
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I think that was mentioned in the Planetside 2/Oculus Rift thread.

Yes. That was it. I can't find it now.

Does anyone know how well the Jaguar cores stack up to the CPU in Xbox 360 and Cell?
 

KageMaru

Member
All this arguing over specs is pointless. If these consoles launch and the games are identical on both, the cheaper console will win. Some are forgetting that if both consoles launch at the announced prices, the PS4 will always be the cheaper one this gen. You would be crazy to think MS is the only one who will drop their price in a year.

Games won't be identical, at least not long after launch, that's for sure.

That shouldn't stop people from enjoying the Xbone versions, not everyone needs to purchase the superior console version (as we saw this gen on both the PS3 and 360).
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Games won't be identical, at least not long after launch, that's for sure.

That shouldn't stop people from enjoying the Xbone versions, not everyone needs to purchase the superior console version.

Do you mean the one with Kinect support?
 
Do you mean the one with Kinect support?

(reply gif)
 

benny_a

extra source of jiggaflops
Console devs have operated in such an environment for years now. But yeah, Sony had some serious balls calling it supercharged PC architecture.
Cerny elaborated on what is meant by the supercharged PC architecture. Do you mind addressing these architecture differences in detail?
 
Do you mean the one with Kinect support?

I don't think we will see much more than Kinect Voice commands in multiplatform titles. And those can be integrated in the PS4 version as well.

So when the 360 is a bit easier to develop for than PS3 during its first cycle, John Carmack says ''it's the greatest system ever created!!'', but when the PS4 literally trounces the Xbox 1 in software development (to the point where developers were utilizing PS4 tools to develop games on x1, and there are even talks of 20-30-40fps vs 60fps), ''they're the same'' lol. I'm sure all those complaining about the X1 being a bitch to develop for are all delusional. They could be using similar architecture, but they sure don't have the same quality dev kits.

I like how this man has taken a shine to PSVita, probably lost a bet or something.

Hahaha oh wow.
 
Cerny elaborated on what is meant by the supercharged PC architecture. Do you mind addressing these architecture differences in detail?

I always assumed it was in reference to the unified memory architecture and GDDR5 memory personally

Don't think he was actually talking numbers
 

mrklaw

MrArseFace
It was interesting to read in another thread that a PC dev was having a hard time splitting his main game loop to run across many threads/many cores. It goes to show how weak the Jaguar cores are relative to a modern Intel gaming computer.

I wonder how many other PC devs are going to have a hard time breaking up their gaming code to work successfully over multiple weak CPU cores.

Imagine how much better that PC game would run if it could efficiently use all 8 threads in a modern Intel CPU. Or are you happy for it to simply brute-force it through on one or two threads?

Again, console games will improve PC games by requiring multithreading for optimal performance, and those will port real nice to PCs. You shouldn't be praising lazy devs and criticising 'weak' console CPUs for something that will benefit you directly.


Planetside 2 is a good example. The devs have said that they need to significantly rework the code for PS4, but that rework will feed back into improving the PC performance.



Console devs have operated in such an environment for years now. But yeah, Sony had some serious balls calling it supercharged PC architecture.

What thread BTW?

Architecture is not the same as performance.
 

Thorgal

Member
Damn, this thread has turned into a war zone.

Guys, relax. Can't we all just enjoy the games?

Xbox One, PS4, PC gamers, and yes, even Wii U gamers will all receive great games we can all enjoy regardless of power differences.

There is no need whatsoever to be constantly at each other's throats.
 

szaromir

Banned
Cerny elaborated on what is meant by the supercharged PC architecture. Do you mind addressing these architecture differences in detail?
What did he say exactly? That CPU and GPU share the same fast memory bus and can exchange data easily? That's the only improvement I can think of over PC. But as far as the CPU goes, it's a total joke.

http://i.imgur.com/LDjOc7K.png

Supercomputer.
Well that's also laughable.
 
http://i.imgur.com/LDjOc7K.png
Supercomputer.

To be fair, the term "supercomputer" is rather subjective. The XB1 is a supercomputer compared to the 360. It's certainly a supercomputer compared to what most people have in their homes today.

Yes, some gaming PCs are more powerful, but in the big picture people with those machines in their homes are incredibly rare.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4

LOL. It's the perfect reply gif :)

Imagine how much better that PC game would run if it could efficiently use all 8 threads in a modern Intel CPU. Or are you happy for it to simply brute-force it through on one or two threads?

Again, console games will improve PC games by requiring multithreading for optimal performance, and those will port real nice to PCs. You shouldn't be praising lazy devs and criticising 'weak' console CPUs for something that will benefit you directly.


Planetside 2 is a good example. The devs have said that they need to significantly rework the code for PS4, but that rework will feed back into improving the PC performance.

I know. It's good news but I suspect a lot of devs have gotten really used to having a massive single game thread and rely on brute forcing the game.

I also think this should be very good news for folks who have an AMD FX-8350. If PC devs get this right, I can see an 8-core AMD outperforming a more expensive Intel chip.
 

benny_a

extra source of jiggaflops
What did he say exactly? That CPU and GPU share the same fast memory bus and can exchange data easily? That's the only improvement I can think of over PC. But as far as the CPU goes, it's a total joke.
http://www.neogaf.com/forum/showthread.php?t=532077
http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php
http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny

It's been long enough that this specific thing has been talked about on GAF, so after you've read it I think your analysis is new thread worthy.
 
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx

AMD has this image in the link; how does it translate to the PS4/XB1, or is it an exaggerated image?
Memory_Performance_chart_v3.png

Based on 3DMark Vantage benchmarks, so who knows what it means for consoles.

First footnote on the page:

Based on internal ASIC power measurement data from the AMD Radeon™ HD 6490M 1GB GDDR5 @ 750MHz (engine clock)/800 MHz (memory clock) = 16.5W vs. AMD Radeon™ HD 6470M 1GB DDR3 @ 750MHz (engine clock)/900 MHz (memory clock) = 14.9W. Up to 29% performance improvement measured using 3DMark Vantage Performance scores for the AMD Radeon™ HD 6490M. AMD Radeon™ HD 6490M 3DMark Vantage Performance scores = 1811 compared to 1401 for the AMD Radeon™ HD 6470M. System configs: Asus M4A89GTD-Pro/USB3, CPU: AMD Phenom II 965 @ 2.3 GHz Quad Core, 4GB DDR3 1333, Windows® 7 64-bit Ultimate, resolution 1280x800.

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#3
 
Man, I love nVidia and AMD's wonky graphs. The image itself says it's showing a 20% difference but it looks like it's almost double because of the 3D columns and the angled axis.
Yeah it's misleading, but it's expressing a truth, and not just about AMD hardware: nVidia GPUs see the same kind of performance delta between DDR3 and GDDR5.

However, that says little about real-world comparisons of the consoles because it doesn't take into account the eSRAM architecture of the One, which was designed to attenuate the gap.
 
Well first of all, I think Carmack's more concerned with his rocket company shutting down than extensively comparing the XB1 to the PS4 to please or annoy console fanboys.

Second of all, comparing a PC to a console is pretty much ridiculous. PCs don't have the financial, electrical, thermal and longevity (as a game machine) issues that consoles have to face.

Finally, did the PS1's or PS2's graphical inferiority to their competitors at the time stop anyone from enjoying all the great games on those systems? What's different now?

Get a grip, people.
 

RoboPlato

I'd be in the dick
Yeah it's misleading, but it's expressing a truth, and not just about AMD hardware: nVidia GPUs see the same kind of performance delta between DDR3 and GDDR5.

However, that says little about real-world comparisons of the consoles because it doesn't take into account the eSRAM architecture of the One, which was designed to attenuate the gap.

Oh, I know there's a real gap in GDDR5 and DDR3 performance. It just seems like the GPU companies put out BS graphs for every possible thing.
 

tfur

Member
What did he say exactly? That CPU and GPU share the same fast memory bus and can exchange data easily? That's the only improvement I can think of over PC. But as far as the CPU goes, it's a total joke.

So, it's about architecture. The CPU and GPU in the PS4 sharing the same fast memory is honestly huge. It's what makes the design of the XB1 using DDR3 such a sore spot. Using DDR3 in the design does not honestly embrace the APU concept.

Intel is great and all, but unless you cache block (if you can) your code into small pieces, it's not all that. This is required to get the most out of Intel CPUs, since currently supported memory types are too slow. This is why GPUs are so prevalent in HPC, as we need the GPU memory speed.
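
For anyone wondering what "cache block your code into small pieces" looks like in practice, here's a generic loop-tiling sketch (a textbook blocked matrix multiply, nothing console-specific; the sizes are illustrative, not tuned):

```cpp
// Minimal sketch of cache blocking: process data in tiles small enough to
// stay resident in cache instead of streaming the whole working set from
// RAM on every pass. Sizes are illustrative.
#include <cstdio>
#include <vector>

constexpr int N = 512;      // matrix dimension
constexpr int TILE = 64;    // tile chosen so the active blocks fit in cache

// C += A * B, iterating over TILE x TILE blocks to improve cache reuse.
static void matmul_blocked(const std::vector<float>& A, const std::vector<float>& B,
                           std::vector<float>& C) {
    for (int ii = 0; ii < N; ii += TILE)
        for (int kk = 0; kk < N; kk += TILE)
            for (int jj = 0; jj < N; jj += TILE)
                for (int i = ii; i < ii + TILE; ++i)
                    for (int k = kk; k < kk + TILE; ++k) {
                        const float a = A[i * N + k];
                        for (int j = jj; j < jj + TILE; ++j)
                            C[i * N + j] += a * B[k * N + j];
                    }
}

int main() {
    std::vector<float> A(N * N, 1.0f), B(N * N, 2.0f), C(N * N, 0.0f);
    matmul_blocked(A, B, C);
    std::printf("C[0] = %f (expect %f)\n", C[0], 2.0f * N);
    return 0;
}
```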
 

nib95

Banned
Thuway is very funny sometimes. When that whole DF RAM article was the hot topic, he was copping pleas about the amount of RAM not mattering to devs and about developers saying they didn't care.



http://www.neogaf.com/forum/showpost.php?p=73171956&postcount=677

but when the Xbox One 3GB OS reserve first came out he was singing a different tune



http://www.neogaf.com/forum/showpost.php?p=58359297&postcount=27

Lol, his posts don't say what they do for the reasons you think. On the Xbox One it's a lot more than just the RAM reserve. Their main emphasis was the OS/system features, hence the use of DDR3 instead of GDDR5 and Kinect being forced and packed in over providing better hardware. Everything about the console, from design to hardware, including all past leaked documentation, points to as much. Its very aim from the beginning was to be an all-in-one media machine, not just a games console. Thuway is absolutely right about this.
 

Melchiah

Member
You know what I love about the "Microsoft is playing catch up" narrative being pushed for the past few months?

At E3 Microsoft had Forza 5 and Ryse on display and they looked a hell of a lot more impressive and close to being finished than Driveclub or Knack.

Considering that Sony showed gameplay/real-time material of four of their own games in February (Killzone: Shadow Fall and Infamous: Second Son included), whereas over three months later in June, Microsoft only showed Forza and brief teasers of the next Halo and the Black Tusk project, I'd say MS is behind in development.
 
Lol, his posts don't say what they do for the reasons you think. On the Xbox One it's a lot more than just the RAM reserve. Their main emphasis was the OS/system features, hence the use of DDR3 instead of GDDR5 and Kinect being forced and packed in over providing better hardware. Everything about the console, from design to hardware, including all past leaked documentation, points to as much. Its very aim from the beginning was to be an all-in-one media machine, not just a games console. Thuway is absolutely right about this.

What do you mean it's more than just the RAM reserve? He said that last quote in a thread about the Xbox One RAM reserve. Taking 3 gigs from developers for OS functions was proof that it wasn't a gaming machine, but potentially taking 3 gigs from developers on the PS4 was not an issue because developers said it was plenty and the difference between 5 and 7 wouldn't be that big. How would that same principle not apply for developers on Xbox One?

I know it's DDR3 vs GDDR5, but let's also keep things in perspective here: the X1's memory bandwidth increased over the 360 by the same amount that the 360 increased over the OG Xbox, and that's 3x.

OG Xbox bandwidth: 6.4 GB/s
Xbox 360 bandwidth: 22.4 GB/s = 3x increase over OG Xbox
Xbox One bandwidth: 68.3 GB/s = 3x increase over 360

So to say the Xbox One is less of a gaming machine is disingenuous unless you believe the Xbox 360 wasn't a gaming machine either.
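
Those bandwidth figures fall straight out of bus width times effective transfer rate. Quick sanity-check sketch below; the clock and bus-width numbers are the commonly cited ones, so treat them as assumptions rather than official spec sheets:

```cpp
// Back-of-the-envelope check of the bandwidth figures above:
// peak bandwidth = effective transfer rate (MT/s) * bus width (bytes).
// Clock/bus-width values are the commonly cited ones, not official specs.
#include <cstdio>

static double peak_gbps(double mega_transfers_per_s, int bus_bits) {
    return mega_transfers_per_s * (bus_bits / 8.0) / 1000.0;  // GB/s
}

int main() {
    std::printf("OG Xbox  (DDR 400 MT/s, 128-bit):    %.1f GB/s\n", peak_gbps(400, 128));
    std::printf("Xbox 360 (GDDR3 1400 MT/s, 128-bit): %.1f GB/s\n", peak_gbps(1400, 128));
    std::printf("Xbox One (DDR3 2133 MT/s, 256-bit):  %.1f GB/s\n", peak_gbps(2133, 256));
    std::printf("PS4      (GDDR5 5500 MT/s, 256-bit): %.1f GB/s\n", peak_gbps(5500, 256));
    return 0;
}
```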
 

RayMaker

Banned
What do you mean it's more than just the RAM reserve? He said that last quote in a thread about the Xbox One RAM reserve. Taking 3 gigs from developers for OS functions was proof that it wasn't a gaming machine, but potentially taking 3 gigs from developers on the PS4 was not an issue because developers said it was plenty and the difference between 5 and 7 wouldn't be that big. How would that same principle not apply for developers on Xbox One?

I know it's DDR3 vs GDDR5, but let's also keep things in perspective here: the X1's memory bandwidth increased over the 360 by the same amount that the 360 increased over the OG Xbox, and that's 3x.

OG Xbox bandwidth: 6.4 GB/s
Xbox 360 bandwidth: 22.4 GB/s = 3x increase over OG Xbox
Xbox One bandwidth: 68.3 GB/s = 3x increase over 360

So to say the Xbox One is less of a gaming machine is disingenuous unless you believe the Xbox 360 wasn't a gaming machine either.

Hear hear, well said.

Just because the PS4 is better does not make the X1 bad by next gen standards.

And on a side note, I was wondering: is the X1's memory solution as much of a puzzle as Cerny was making it out to be? I don't understand why it would be. The 360 also had eDRAM, but devs liked the 360's RAM and thought it was an easier platform to develop for. So why is an eSRAM/eDRAM + DDR3 RAM setup all of a sudden a puzzle for developers this new gen, when it was the most preferred and efficient setup last gen?


I see it like this

If a dev was grading both next gen console RAM solutions, the PS4 would get an A* and the X1 would get an A.
 

nib95

Banned
What do you mean it's more than just the RAM reserve? He said that last quote in a thread about the Xbox One RAM reserve. Taking 3 gigs from developers for OS functions was proof that it wasn't a gaming machine, but potentially taking 3 gigs from developers on the PS4 was not an issue because developers said it was plenty and the difference between 5 and 7 wouldn't be that big. How would that same principle not apply for developers on Xbox One?

I know it's DDR3 vs GDDR5, but let's also keep things in perspective here: the X1's memory bandwidth increased over the 360 by the same amount that the 360 increased over the OG Xbox, and that's 3x.

OG Xbox bandwidth: 6.4 GB/s
Xbox 360 bandwidth: 22.4 GB/s = 3x increase over OG Xbox
Xbox One bandwidth: 68.3 GB/s = 3x increase over 360

So to say the Xbox One is less of a gaming machine is disingenuous unless you believe the Xbox 360 wasn't a gaming machine either.

It's not 2005 anymore, dude. Next gen graphics require next gen graphical features, physics, simulations and so on that are far more bandwidth-centric and demanding. That is why GDDR5 is the de facto choice for video cards today. The only reason Microsoft went with DDR3 is because nobody knew if 4Gb GDDR5 chips would be available in time, and they needed the large RAM quantity for that 3GB OS reserve (so 8GB was essentially a must from the beginning, with system uses at the forefront of the design). Sony lucked out and Microsoft is left with a more limited memory system with much less bandwidth.

With respect to the advantages of GDDR5 over DDR3, there was this awesome post over at TechSpot by a developer.

It goes back to the nature of graphics rendering and how the polygons are drawn. Sorry if I'm teaching my grandmother to suck eggs, but it might be a little easier if I outline the graphics pipeline. The video memory (vRAM) transactions are marked inline below; everything before them runs out of system RAM (probably be better as a flow chart, but nvm).

On the software side you have your game (or app) ↔ API (DirectX/OpenGL) ↔ User Mode Driver / ICD ↔ Kernel Mode Driver (KMD) + CPU command buffer → loading textures to vRAM → GPU Front End (Input Assembler).

Up until this point you're basically dealing with CPU and RAM: executing and monitoring game code, creating resources, shader compile, draw calls and allocating access to the graphics (since you likely have more than just the game needing resources). From here, the workload becomes hugely more parallel and moves to the graphics card. The video memory now holds the textures and the shader compilations that the game+API+drivers have loaded. These are fed into the first few stages of the pipeline as and where needed by each of the following shaders as the code is transformed from points (co-ordinates) and lines into polygons and their lighting:

Input Assembler (vRAM input) → Vertex Shader (vRAM input) → Hull Shader (vRAM input) → Tessellation Control Shader (vRAM input) (if Tessellation is used) → Domain Shader (vRAM input) → Geometry Shader (vRAM input)

At this point, the stream output can move all or part of the render back into the memory to be re-worked. Depending on what is called for, the output can be called to any part of the previous shader pipeline (basically a loop) or held in memory buffers. Once the computations are completed they then move to Rasterization (turning the 3D image into pixels):

Rasterizer → Pixel Shader* (vRAM input and output) → Output Manager (tasked with producing the final screen image, and requires vRAM input and output)

* The Compute Shaders (if they exist on the card) are tasked with post processing (ambient occlusion, film grain, global illumination, motion blur, depth of field etc.), A.I. routines, physics, and a lot of custom algorithms depending on the app. They also run via the pixel shader, and can use that shader's access to vRAM input and output.

So basically, the parallel nature of graphics calls for input and output from vRAM at many points covering many concurrent streams of data. Some of that vRAM is also subdivided into memory buffers and caches to save data that would otherwise have to be re-compiled for following frames. All this swapping out of data calls for high bandwidth, but latency can be lax (saving power demand), as any stall in one thread is generally lost in the sheer number of threads queued at any given time.
As I noted previously, GDDR5 allows a write and read to/from memory every clock cycle, whereas DDR3 is limited to a read or a write, which reduces bandwidth. Graphics DDR also allows for multiple memory controllers to cope with the I/O functions.

The use of GDDR5 is probably mandatory if you note the likelihood of increased complexity in the next generation console games (higher polygon counts, more complex post process image quality). The PS4 will use an AMD APU, which has already demonstrated that it is very sensitive to memory bandwidth, and given the long life cycle of a console it needs a degree of future proofing by adding as much bandwidth as possible.

But that's mainly why Microsoft put the eSRAM in there. It's a band-aid solution to the problem, and will certainly help towards alleviating the issue, but not curing it altogether.
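
To put a rough number on why 32 MB of eSRAM only partially papers over the gap, here's a back-of-the-envelope sketch (the deferred G-buffer layout is purely hypothetical; real engines vary, but the order of magnitude is the point):

```cpp
// Illustrative arithmetic (my example, not from the quoted post): why 32 MB
// of fast on-chip memory is a tight fit for 1080p render targets. The
// G-buffer layout below is hypothetical.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    const int w = 1920, h = 1080;
    const int bytes_per_pixel = 4;   // one 32-bit render target
    const int color_targets = 4;     // hypothetical deferred G-buffer: 4 RTs
    const int depth_targets = 1;     // plus a 32-bit depth/stencil target

    const double one_rt = double(w) * h * bytes_per_pixel;
    const double gbuffer = one_rt * (color_targets + depth_targets);

    std::printf("one 1080p 32-bit target: %.1f MiB\n", one_rt / MiB);
    std::printf("hypothetical G-buffer:   %.1f MiB vs 32 MiB of eSRAM\n", gbuffer / MiB);
    return 0;
}
```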
 

KageMaru

Member
here here, well said

just becuase the PS4 is better does not make the x1 bad by the next gen standard

Yup, exactly this. It's almost as if some of these people forgot about the PS2 and how great that system was despite the performance gap.

IMO Adam Sessler said it best that many of these people are just creating a narrative for how they perceive things, not how they really are (or something along those lines).
 
It's not 2005 anymore, dude. Next gen graphics require next gen graphical features, physics, simulations and so on that are far more bandwidth-centric and demanding. That is why GDDR5 is the de facto choice for video cards today. The only reason Microsoft went with DDR3 is because nobody knew if 4Gb GDDR5 chips would be available in time, and they needed the large RAM quantity for that 3GB OS reserve (so 8GB was essentially a must from the beginning, with system uses at the forefront of the design). Sony lucked out and Microsoft is left with a more limited memory system with much less bandwidth.

With respect to the advantages of GDDR5 over DDR3, there was this awesome post over at TechSpot by a developer.

But that's mainly why Microsoft put the eSRAM in there. It's a band-aid solution to the problem, and will certainly help towards alleviating the issue, but not curing it altogether.
Please point out where I said GDDR5 wasn't better. All I was pointing out is that Xbox One isn't any less of a gaming machine because GDDR5 has more bandwidth. The X1 is still a plenty capable piece of hardware in spite of losing to the PS4 in specs.
 

No Love

Banned
I think Carmack saying these two consoles are close in power is like saying the Moon and Earth are close together: yes, if you look at the distance between Earth/Moon vs. distance between objects in our solar system, yep, they seem pretty close.

But if you're on Earth looking at the moon or vice versa... they're very far away from each other. Sony has a big advantage this generation, and it'll really start to show in 2-3 years with console exclusives. I really don't think Microsoft's 1st party games will come anywhere near what Naughty Dog, Sony Santa Monica, Guerrilla Games, Sucker Punch, and Polyphony will pump out this gen. Sony has a massive advantage in developer talent at their disposal.

It kinda sucks, because the 360 was so well designed and in many ways was superior to PS3 this gen. :\
 
