
Xbox 360 vs. PS3, aka "these dicks haven't still been measured enough, must continue still"

Virex

Banned
My Atari Jaguar shits all over current, old and next-gen systems. The Jaguar eats monsters that eat monsters that eat monsters for breakfast. Fucking kneel before the power of the Jaguar you ignorant savages.
 

SonGoku

Member
My Atari Jaguar shits all over current, old and next-gen systems. The Jaguar eats monsters that eat monsters that eat monsters for breakfast. Fucking kneel before the power of the Jaguar you ignorant savages.
How many Lynxes duct-taped together?
 

Ar¢tos

Member
ZX Spectrum >>>>> 7th gen
Back to Skool had more depth than most AAA games nowadays.
And Chaos: The Battle of Wizards is more engaging and fun than most current MP games.
 

Quezacolt

Member
If we were to compare them from the point of view of a normal consumer, I'd say that the 360 was much better. The OS of the PS3 is so slow: installing games is slow, downloading, opening stuff, etc... When it comes to controller quality, the PS3 also lost; they were cheap and would break easily.

Sony was lucky that they had great exclusives and that Blu-ray caught on; if it wasn't for those two things, the PS3 would have been a total failure.
 

Romulus

Member
Back then, I was always surprised at how the PS3 was a year newer than the 360 yet didn't produce any meaningful results for years to come. The original Xbox came out about a year later than the PS2 and was a massive step up from the start.
Actually, even in the last year, the 360 was still getting better-looking multiplatform games for the most part. I know it was because of the Cell, but back then I didn't understand it, and even now it doesn't really matter how it happened; they just screwed up.

As a Sony fan during the ps1, ps2 years especially, I was not a fan of ps3. It sticks out as the ugly duckling for me in terms of exclusives and multiplatforms. There was a fair amount of them, but nothing ever really impressed me much, even compared to PS4 which is a step back in ways to the older playstations. I actually like PSVR exclusives more than ps3's.

1) PS1
2) PS2
3) PS4
4) PSVR
5) PS3
 
Last edited:
The C64 shit all over the Spectrum.
Well it does. I didn't own either back in the day, but if I had to pick one as my gaming platform for the foreseeable future I would take the C64 or not game at all... Sorry, Speccy fans 🤮.

Back on topic.
The original Xbox came out about a year later than the PS2 and was a massive step up from the start.
The PS2 was a pretty decent jump over the Dreamcast (the Dreamcast had a better memory setup, but in the end the PS2 could render better lighting and more polygons).
The Dreamcast was a pretty good jump over the n64, etc.

Moore's law was in full effect back in those days. Also, as much as I liked some stuff on the PS3, it was an overengineered piece of hardware that made releasing games which pushed it hard far more complex than it needed to be.
 
Last edited:

LordOfChaos

Member
Here's what closes the book for me.

To illustrate the peculiarities of Cell programming, we use the Breadth-First Search (BFS) on a graph. Despite its simplicity, this algorithm is important because it is a building block of many applications in computer graphics, artificial intelligence, astrophysics, national security, genomics, robotics, and the like.

Listing One is a minimal BFS implementation in C. Variable G contains the graph in the form of an array of adjacency lists. G.length tells how many neighbors the i-th vertex has, which are in G.neighbors[0], G.neighbors[1], and so on. The vertex from which the visit starts is in variable root. A BFS visit proceeds in levels: First, the root is visited, then its neighbors, then its neighbors' neighbors, and so on. At any time, queue Q contains the vertices to visit in the current level. The algorithm scans every vertex in Q, fetches its neighbors, and adds each neighbor to the list of vertices to visit in the next level, Qnext. To prevent being caught in loops, the algorithm avoids visiting those vertices that have been visited before. To do so, it maintains a marked array of Boolean variables. Neighbors are added to Qnext only when they are not already marked, then they get marked. At the end of each level, Q and Qnext swap, and Qnext is emptied.


Normal CPU:
Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define TRUE 1

/* adjacency list for one vertex */
typedef struct {
  unsigned   length;     /* number of neighbors */
  unsigned * neighbors;  /* indices of the neighboring vertices */
} vertex_t;

/* ... */

/* the graph */
vertex_t * G;

/* number of vertices in the graph */
unsigned card_V;

/* root vertex (where the visit starts) */
unsigned root;

void parse_input( int argc, char** argv );
void graph_load( void );

int main(int argc, char ** argv)
{
  unsigned *Q, *Q_next, *marked;
  unsigned  Q_size = 0, Q_next_size = 0;
  unsigned  level = 0;

  parse_input(argc, argv);
  graph_load();

  Q      = (unsigned *) calloc(card_V, sizeof(unsigned));
  Q_next = (unsigned *) calloc(card_V, sizeof(unsigned));
  marked = (unsigned *) calloc(card_V, sizeof(unsigned));

  Q[0]         = root;
  Q_size       = 1;
  marked[root] = TRUE;   /* mark the root so it is never revisited */
  while (Q_size != 0)
    {
      /* scan all vertices in queue Q */
      unsigned Q_index;
      for ( Q_index = 0; Q_index < Q_size; Q_index++ )
        {
          const unsigned vertex = Q[Q_index];
          const unsigned length = G[vertex].length;
          /* scan each neighbor of the current vertex */
          unsigned i;
          for ( i = 0; i < length; i++ )
            {
              const unsigned neighbor = G[vertex].neighbors[i];
              if ( !marked[neighbor] )
                {
                  /* mark the neighbor */
                  marked[neighbor]      = TRUE;
                  /* enqueue it to Q_next */
                  Q_next[Q_next_size++] = neighbor;
                }
            }
        }
      level++;
      /* swap Q and Q_next, then empty Q_next */
      unsigned * swap_tmp;
      swap_tmp    = Q;
      Q           = Q_next;
      Q_next      = swap_tmp;
      Q_size      = Q_next_size;
      Q_next_size = 0;
    }
  return 0;
}

Doing the same on an SPE becomes a very different story.

On a Pentium 4 HT running at 3.4 GHz, this algorithm is able to check 24 million edges per second. On the Cell, at the end of our optimization, we achieved a performance of 538 million edges per second. This is an impressive result, but came at the price of an explosion in code complexity. While the algorithm in Listing One fits in 60 lines of source code, our final algorithm on the Cell measures 1200 lines of code.





Cell was an interesting, novel design choice, but it ultimately gave no consideration to developer time and budgets, and that's why it failed outside of where Sony would fund a few close-knit studios with nearly unlimited budgets. Would it be interesting to see where a continuation of it would have gone, in a universe simulator? Sure. But ultimately we live in a world where companies have to turn profits, and saving silicon budget on the things that make programming easier by shifting that work onto the programmers was not a winning strategy. Besides that, what it was good at is covered by graphics compute now.
 
Last edited:

CrustyBritches

Gold Member
Is this the meeting room for Objectum Sexualaholics Anonymous? My name is Crusty, and to be truthful I still have a 7th-gen cyber fetish, just now I do it under a different name.

 
actually PS3 had a bit more usable for games because the PS3’s OS reserved less memory than 360’s OS but the difference is a few MB.
PS3 OS reserves a total of 50MB, while XBOX 360 OS reserves 32 MB.

They couldn't even implement cross-game chat due to RAM allocation.


 
Last edited:

Bernkastel

Ask me about my fanboy energy!
It's wrong... You said double the RAM, a disingenuous statement.
PS3 has 512MB total ram available for games split between two pools
PS3 has 512MB RAM... 256MB GDDR3 + 256MB XDR.

Both have the same amount of RAM... actually PS3 had a bit more usable for games because the PS3’s OS reserved less memory than 360’s OS but the difference is a few MB.
OK, didn't notice.
 

LordOfChaos

Member
PS3 OS reserves a total of 50MB, while XBOX 360 OS reserves 32 MB.

They couldn't even implement cross-game chat due to RAM allocation.



Incidentally, a difference of 4.5 expansion packs

Nintendo-64-Memory-Expansion-Pak.jpg
 

Mochilador

Member
That's a tough one, I really like both. Spent most of the last gen with the Xbox 360.
Since we are talking about it, I've decided to finish my PS3 backlog by the end of this year.
 

pawel86ck

Banned
Hmmm, interesting. Worse framerate than a PS3?

I guess the high- vs low-level API difference also makes things even worse... AAA games on PS3 used libGCM.
Worse framerate and stuttering on top of that. 8800 GTX/Ultra had just 768MB VRAM while GTA5 requires around 1.5GB even at minimum settings.
 
Last edited:

SirTerry-T

Member
PS3 has 512MB RAM... 256MB GDDR3 + 256MB XDR.

Both have the same amount of RAM... actually PS3 had a bit more usable for games because the PS3’s OS reserved less memory than 360’s OS but the difference is a few MB.

We always had to make sure our textures would fit into the shitty amount of memory that was left for them on the PS3; it was less of a problem on the 360. So while the two machines had similar amounts of RAM, the different ways the two machines utilised that memory gave the edge to the 360.
At least in my experience of working on titles for them both.
 

shark sandwich

tenuously links anime, pedophile and incels
Early in the generation: 360 just totally kicked PS3’s ass. MUCH better selection of games (including Japanese games), better online, and multiplats ran better on 360.

Mid-generation: pretty even. More games ran better on 360, but games were being better optimized for PS3. PS3 got some actual compelling exclusives, and lots of timed-exclusive 360 games made their way to PS3.

Late-generation: PS3 by a landslide. Microsoft was spending most of their effort chasing after that Kinect casual audience, plus 360’s later OS designs sucked. Meanwhile PS3 was getting some killer exclusives and some amazingly impressive games that showed what that old system could do.


Overall, I’d personally give the edge to 360. Early 360 was one of my favorite eras in gaming, up there with Dreamcast. By the time PS3 pulled ahead, I had a gaming PC that kicked the crap out of both systems so I was doing most of my gaming there.
 
Last edited:

ethomaz

Banned
We always had to make sure our textures would fit into the shitty amount of memory that was left for them on the PS3; it was less of a problem on the 360. So while the two machines had similar amounts of RAM, the different ways the two machines utilised that memory gave the edge to the 360.
At least in my experience of working on titles for them both.
I think the unified pool gives the 360 a lot of advantages over the PS3’s setup; plus, to use the 256MB of XDR you had to go through the Cell’s FlexIO bus.

PS3 OS reserves a total of 50MB, while XBOX 360 OS reserves 32 MB.

They couldn't even implement cross-game chat due to RAM allocation.


you are correct.
 
Last edited:
My Atari Jaguar shits all over current, old and next-gen systems. The Jaguar eats monsters that eat monsters that eat monsters for breakfast. Fucking kneel before the power of the Jaguar you ignorant savages.
No. While I have respect for most underrated consoles, most of them have a number of actually good, or even great, games; the 3DO and 32X, from the same era, are good examples of this.

The Jag is not redeemable. It's like a really weak Nintendo 64 (cartridges included), it was worse than the 3DO at pretty much everything, and it offered a really weak library. I mean, at least the 32X had some pretty good arcade ports of Sega games!
 

"With the console versions of Assassin's Creed Rogue, there's a definite sense that the conversion work across both platforms isn't as closely matched as 2013's Black Flag. The Xbox 360 version is softer and noticeably fuzzier than the PS3 release: while both versions utilise a form of FXAA that attempts to mimic traditional multi-sampling style coverage across edges (but considerably blurring the image in the process), the PS3 version renders natively at 720p whereas a sub-HD resolution is in place on the Microsoft console. Pixel counting puts the ballpark native resolution on the 360 at around 1200x688.

Beyond the framebuffer set-up, we find the core art and most of the effects work is interchangeable between PS3 and 360, although there are some unexpected differences between the two platforms that were not present in Black Flag. For one, SSAO is present on PS3, helping to add depth to characters and the environment, while on Xbox 360 the effect is completely absent, lending more brightly lit scenes a generally flatter appearance. Secondly, in most cases we find that streaming is generally slower on 360, with low resolution textures left on-screen (sometimes without normal maps) for a few seconds during changes in camera angles in some cut-scenes, and when transitioning to gameplay. The situation is much improved on PS3, where the majority of the best quality assets are usually loaded in before the scene begins."


This game was released in 2014, one year after the next-gen consoles. It's clear to me that Ubisoft had finally mastered the PS3 architecture ("too little, too late" someone could argue).

There's also this one which got a huge upgrade (720p30 -> 1080p60) after they refactored the code:


Imagine if Bayonetta 1 had gotten the same treatment. We will never know. :)

Worse framerate and stuttering on top of that. 8800 GTX/Ultra had just 768MB VRAM while GTA5 requires around 1.5GB even at minimum settings.
This proves once again that console hardware punches above its weight, no matter if it's exotic or PC-based.

GeForce 8800 GTX on a PS3 would do wonders with a low-level API.

To be fair, PCs these days actually have a low-level API (DX12, Vulkan), but it's not the de facto API (yet). Major AAA games like AC Odyssey still use DX11 and suffer from API inefficiencies causing increased CPU overhead.

We always had to make sure our textures would fit into the shitty amount of memory that was be left for them on the PS3, it was less of a problem on the 360. So while the two machines had similar amounts of ram, the different ways the two machines utilised that memory gave the edge to the 360.
RSX can use up to 480MB of RAM (an extra 224MB XDR) via an ultra-fast bus (FlexIO).

It's no coincidence that games like Uncharted 2/3 and TLOU1 had high-res textures vs 3rd party games being blurry AF.
 
Last edited:

Trimesh

Banned
This is an urban myth most likely, since computer parts in EU have always had the full fat VAT tax.

Other stuff (like food/medicine) have a reduced rate.

Don't know about the PS3, but at the time the PS2 was being produced there was definitely a difference in duty rates - that was why the SCEE version of the pack-in demo disc came with a copy of YABASIC on it, as part of an (unsuccessful) attempt to have the machine classified as a computer rather than a games machine.

I don't know if anything similar was attempted with the PS3, but I haven't heard of it.
 

HeresJohnny

Member
The 360 was amazing... a true everyman's machine, up until they fucked themselves with flail controls. Once Kinect came out, the Xbox brand was badly damaged and they doubled down by bundling that fucking turd in with their next system, which already had a ton of other issues besides being bundled with a bullshit controller.

The PS3 started off as a disaster, but they recovered from it somewhat. Still, when you look at how far Sony fell from the PS2, it was pretty bad for them.
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
What even is the controversy? PS3 was technically more advanced, but in a way that was difficult to make use of, especially in multiplatform games, which led to an advantage for the 360 in most multiplatform games.
 
What even is the controversy? PS3 was technically more advanced, but in a way that was difficult to make use of, especially in multiplatform games, which led to an advantage for the 360 in most multiplatform games.
That isn't true. For others in the thread:

The PS3's GPU was so weak in comparison to the 360's that the only thing the Cell could do was bridge that gap. There was no overall higher computing power.

In addition, the Cell could not help with the RSX's weakness in alpha effects or MSAA processing. Take a look at MW3 on 360 vs. PS3 as evidence for this. If you'll notice, all the graphically pushing exclusives on PS3 didn't rely much on alpha effects and often had no MSAA, or very little, until MLAA became a thing. God save us from Quincunx AA.

Lastly, the 360 had a lighter OS, so a bit more memory for games; a unified architecture, which for the most part was better; and finally, very fast eDRAM, which was hugely beneficial.
 
Last edited:

LordOfChaos

Member
GeForce 7 wouldn't suffice for sure, so we can agree on that. Even GTA5 (and that was a multiplatform game) wouldn't run on an 8800 GTX / Ultra


Was this running at 720p to match the consoles' outputs?

That potato masher guy should get back to it and tell us how it's holding up to this generation's consoles

 

BlackTron

Member
So... just popping in here to see if we've found out who has the biggest dick yet?

No? OK, carry on.

While they both got bigger and smaller throughout the battle, max measurement was ultimately exactly the same.

However, only one of the two was black.
 

thelastword

Banned

"With the console versions of Assassin's Creed Rogue, there's a definite sense that the conversion work across both platforms isn't as closely matched as 2013's Black Flag. The Xbox 360 version is softer and noticeably fuzzier than the PS3 release: while both versions utilise a form of FXAA that attempts to mimic traditional multi-sampling style coverage across edges (but considerably blurring the image in the process), the PS3 version renders natively at 720p whereas a sub-HD resolution is in place on the Microsoft console. Pixel counting puts the ballpark native resolution on the 360 at around 1200x688.

Beyond the framebuffer set-up, we find the core art and most of the effects work is interchangeable between PS3 and 360, although there are some unexpected differences between the two platforms that were not present in Black Flag. For one, SSAO is present on PS3, helping to add depth to characters and the environment, while on Xbox 360 the effect is completely absent, lending more brightly lit scenes a generally flatter appearance. Secondly, in most cases we find that streaming is generally slower on 360, with low resolution textures left on-screen (sometimes without normal maps) for a few seconds during changes in camera angles in some cut-scenes, and when transitioning to gameplay. The situation is much improved on PS3, where the majority of the best quality assets are usually loaded in before the scene begins."


This game was released in 2014, one year after the next-gen consoles. It's clear to me that Ubisoft had finally mastered the PS3 architecture ("too little, too late" someone could argue).

There's also this one which got a huge upgrade (720p30 -> 1080p60) after they refactored the code:


Imagine if Bayonetta 1 had gotten the same treatment. We will never know. :)


This proves once again that console hardware punches above its weight, no matter if it's exotic or PC-based.

GeForce 8800 GTX on a PS3 would do wonders with a low-level API.

To be fair, PCs these days actually have a low-level API (DX12, Vulkan), but it's not the de facto API (yet). Major AAA games like AC Odyssey still use DX11 and suffer from API inefficiencies causing increased CPU overhead.


RSX can use up to 480MB of RAM (an extra 224MB XDR) via an ultra-fast bus (FlexIO).

It's no coincidence that games like Uncharted 2/3 and TLOU1 had high-res textures vs 3rd party games being blurry AF.
Money post, this looks exactly like the posts I used to make in older DF threads on these here forums....

You notice also that all the AC games started looking just as good on PS3 once they ditched Quincunx, so when DF spoke of better textures in AC, it was just the Quincunx AA blurring the early PS3 AC games.....I still remember folks talking about Bayonetta as if that meant anything; if Bayonetta had been done by Nixxes/Bluepoint/Hexadrive, or from the ground up at Platinum,
it may have looked and performed even better than the 360 version.....Then look at Platinum's Vanquish, which was done in-house on PS3....

Vanquish

"In terms of graphical differences between the two games, it's fair to say that there aren't any of real significance - certainly nothing that might affect a purchasing decision: both run at 1024x720 resolution with 2x multi-sampling anti-aliasing (MSAA). Vanquish is a spectacular-looking game that throws out an astounding assortment of effects and the overall look is exactly the same on both platforms. Just about the only issue we did notice on a consistent basis was a difference in shadowing, which seems to favour PS3. In these shots you can see shadowing that's either omitted completely or else suffering from "shadow acne" on the 360. "

https://www.eurogamer.net/articles/digitalfoundry-vanquish-face-off


You remember when folk said ZOE could not run better on PS3, that it was bandwidth-starved and couldn't handle all that alpha? Kojima showed them that was a lie. The game went from a sub-30fps abomination at 720p with no AA to everything improved, as much as 10x in performance and IQ, now targeting 60fps. A game which fell to the teens at 720p, with reduced alpha resolution and no AA, was now internally rendered at 1280x1080 with the alpha resolution restored, lots of AA for smooth IQ, and targeting 60fps, which it holds 99% of the time.

In truth, most of the multiplat devs never really programmed to the PS3 architecture; they only used RSX. I can only imagine if RSX had been stronger (perhaps if it had my 8800 GTX Ultra): not for the multiplat devs, but for first-party devs, I think all games on PS3 would have been 1920x1080 with much better effects and features. Imagine what GOW3, Ascension, UC2/3, Killzone 2/3, GT5/6 (the best-looking games of that generation) would have looked like with an even stronger GPU + Cell placed in the hands of SWWS...

Remember, pretty much all first-party games from Sony were 720p or higher, unlike MS first party with its "heralded Xenos".....GT5 ran 1280x1080, GT6 ran 1440x1080, RR7 ran 1080p 60fps, whilst most of the 360 games were only 720p......Forza had no dynamic weather, shadows or TOD like GT5...and people still question which console was more powerful.....RSX + Cell, properly used, simply destroyed Xenos + eDRAM.....


An example....

ZOE2

"One of the key changes implemented by HexaDrive to solve the performance problem is the utilisation of the PlayStation 3's SPUs to handle the workload previously designed for PlayStation 2's vector units. ZOE 2 pushed the PlayStation 2 to its limits and the RSX simply doesn't have the muscle on its own to power through it without help. HexaDrive balanced performance across the entire system, including the SPUs, in order to avoid bottlenecks, resulting in a 10x increase in performance across the board.

Freeing up the RSX also allows for an increase in image quality as well as frame-rate. HexaDrive has taken a three-pronged approach to image quality starting with an output resolution of 1080p complete with higher resolution HUD, text, and menu elements. In-game we see 3D elements rendered internally at a resolution of 1280x1080 with FXAA used to smooth out aliasing. Nvidia's post-process effect was selected as a result of its speed and the game's reliance on alpha test transparency, which doesn't play nicely with standard MSAA. On the flipside, 32x MSAA was used specifically for the display of wireframe elements which do not work well with FXAA. While we would have liked to have seen the in-game visuals rendered at 1920x1080, the results still manage to provide a smooth and clean image that meshes perfectly with the game's aesthetic and it still represents an extra 50 per cent boost in detail over High Voltage's initial efforts.
Another hallmark of the series is its heavy usage of particles and alpha effects. Laser trails, explosions, and smoke are pushed to the breaking point on PlayStation 2 requiring the use of a lower resolution alpha buffer. This same technique is also employed in the original HD release producing even more noticeably chunky effects as a result of the higher resolution. Such effects are among the most expensive on current generation consoles, particularly the PlayStation 3, so it's surprising to find the updated release now rendering alpha effects at full resolution. Particle density is also restored to match the original PS2 version, which was also drastically reduced in the initial HD release.

Another core focus of this patch concerns matching the visual effects work seen in the PS2 original as closely as possible. To that end we find that textures originally exhibiting colour banding are cleaned up and corrected to match the PS2 release. PS2 games were often limited to 8-bit palette textures as a result of memory limitations and no longer have proper support on modern hardware. HexaDrive has implemented a means to properly emulate the appearance of such textures on modern hardware allowing for smoother blending of colours. It's a detail that is missing in a number of other PlayStation 2 ports going all the way back to the original Xbox release of Metal Gear Solid 2.

In addition, various on-screen elements including the map display are properly adjusted for 16:9 where previously they were stretched. The appearance of fog, shading and bloom lighting are all adjusted to more closely match the original PS2 version as well. While subtle differences remain between the PS2 release and this updated HD release, the overall appearance of the game is much more accurate than it was previously.

Another important visual effect missing from the original HD release is depth of field. Kojima games are known for their cinematic quality and depth of field plays a large role in this. Going back as far as the original Metal Gear Solid we see varying implementations of this cinematic effect, so it's safe to say that its removal is surprising. Depth of field is rendered at a lower resolution on PlayStation 2 similar to other alpha effects so perhaps, as a result of the original implementation, the effect simply didn't play well at 1280x720. Version 2.0 not only reintroduces depth of field but does so using modern techniques. The result is a smooth, higher resolution effect complete with bokeh. The effect is most prominent throughout the game's cinema sequences but it is also used to enhance numerous gameplay sequences as well.

It's clear that a lot of work was put into this patch, which finally treats the game - and its fans - with the respect they deserve, but we remain mystified as to why these HD remasters were released in such a shocking state to begin with. Bearing in mind the quality of the original ZOE HD release and its Silent Hill counterpart, we can only assume that budgets took priority over properly remastering these games in a high definition format. That said, we also see that lessons can be learned and perhaps this investment will help to avoid such costly mistakes in the future. With the next generation of consoles on the horizon, the days of porting PlayStation 2 software are no doubt numbered but it's a safe bet that we'll see PS3 and 360 games ported to their successors somewhere down the line. Let's hope that the example of Zone of the Enders will inspire publishers to get their future remasters right the first time - after all, at the core, these releases are fan service, and nostalgia-fuelled gamers are clearly a highly discerning bunch.

https://www.eurogamer.net/articles/digitalfoundry-how-konami-remade-zoe-hd-remaster
 

LordOfChaos

Member
My PS3 ran PS1 and 2 games natively, and emulated SNES games through Linux Ubuntu.

There's nothing you could ever say to convince me the 360 was a better system.

God, I want Other OS back in the 9th gen. If they truly use an SSG-like solution where the GPU can see the NAND as another layer of framebuffer and uses the VRAM to cache it, that's doing what I can't do without a $5999 card. Now get that in a $400-500 box and we're back to where the PS3 had interesting compute applications, as something you couldn't get anywhere else for that cost.

Oh wait..Now I'm ruining the last gen thread with next gen talk rather than the other way around lol.
 

SirTerry-T

Member
To be fair, it's not like the 360 went out with a whimper, at least graphically... the Rise of the Tomb Raider port was pretty bloody impressive, even outside of the pre-rendered cutscenes.

 

Ten_Fold

Member
From 05-10 the 360 was much better: overall better games, a better community, and a good amount of JRPGs, which was surprising. I think around 2011 the PS3 really started to take over, and Sony held that lead strong going into the PS4.
 
Bethesda never learned how to do the PS3. Oblivion, Fallout 3 and Skyrim would all die slow deaths the larger your save file became. 50 hours in, you were plodding along at 2-3 fps.

Oh Bethesda, you incompetent hacks, you.
 

Bogroll

Likes moldy games

"With the console versions of Assassin's Creed Rogue, there's a definite sense that the conversion work across both platforms isn't as closely matched as 2013's Black Flag. The Xbox 360 version is softer and noticeably fuzzier than the PS3 release: while both versions utilise a form of FXAA that attempts to mimic traditional multi-sampling style coverage across edges (but considerably blurring the image in the process), the PS3 version renders natively at 720p whereas a sub-HD resolution is in place on the Microsoft console. Pixel counting puts the ballpark native resolution on the 360 at around 1200x688.

Beyond the framebuffer set-up, we find the core art and most of the effects work is interchangeable between PS3 and 360, although there are some unexpected differences between the two platforms that were not present in Black Flag. For one, SSAO is present on PS3, helping to add depth to characters and the environment, while on Xbox 360 the effect is completely absent, lending more brightly lit scenes a generally flatter appearance. Secondly, in most cases we find that streaming is generally slower on 360, with low resolution textures left on-screen (sometimes without normal maps) for a few seconds during changes in camera angles in some cut-scenes, and when transitioning to gameplay. The situation is much improved on PS3, where the majority of the best quality assets are usually loaded in before the scene begins."


This game was released in 2014, one year after the next-gen consoles. It's clear to me that Ubisoft had finally mastered the PS3 architecture ("too little, too late" someone could argue).

There's also this one which got a huge upgrade (720p30 -> 1080p60) after they refactored the code:


Imagine if Bayonetta 1 had gotten the same treatment. We will never know. :)


This proves once again that console hardware punches above its weight, no matter if it's exotic or PC-based.

GeForce 8800 GTX on a PS3 would do wonders with a low-level API.

To be fair, PCs these days actually have a low-level API (DX12, Vulkan), but it's not the de facto API (yet). Major AAA games like AC Odyssey still use DX11 and suffer from API inefficiencies causing increased CPU overhead.


RSX can use up to 480MB of RAM (an extra 224MB XDR) via an ultra-fast bus (FlexIO).

It's no coincidence that games like Uncharted 2/3 and TLOU1 had high-res textures vs 3rd party games being blurry AF.
You haven't mentioned framerate though.
From DF
" Less demanding moments see the Xbox 360 game frequently operate between 35-40fps, while PS3 frame-rates often fall between 28-32fps in similar situations. In both cases we're looking at an improvement over a general run of play in Black Flag but it's worth bearing in mind that the locations on offer during the first few hours of Rogue are smaller and generally less populated than some of those found in the previous game, leading to higher frame-rates and smaller dips in performance.
"Even so, the fluctuating frame-rates around the 30fps target cause noticeable judder, though the 360 acquits itself more confidently here with the stuttering less impactful than PS3. As expected, scenes featuring heavy alpha effects also operate more smoothly owing to the bandwidth advantage offered by the eDRAM: Microsoft's console more closely holds to 30fps when sailing in foggy weather conditions, whereas we see frame-rates hit much harder on the PS3, with frame-rates often falling into the mid-twenties. That said, detailed environments cause both consoles to dip below 30fps in a similar fashion, and sometimes we see the PS3 gain the advantage when bandwidth isn't a limiting factor - although, this doesn't happen too often. "


And just maybe devs didn't have to put as much effort into the 360, so they never squeezed out everything it could have done. Just a thought.
 
Last edited:
You haven't mentioned framerate though.
From DF
" Less demanding moments see the Xbox 360 game frequently operate between 35-40fps, while PS3 frame-rates often fall between 28-32fps in similar situations. In both cases we're looking at an improvement over a general run of play in Black Flag but it's worth bearing in mind that the locations on offer during the first few hours of Rogue are smaller and generally less populated than some of those found in the previous game, leading to higher frame-rates and smaller dips in performance.
"Even so, the fluctuating frame-rates around the 30fps target cause noticeable judder, though the 360 acquits itself more confidently here with the stuttering less impactful than PS3. As expected, scenes featuring heavy alpha effects also operate more smoothly owing to the bandwidth advantage offered by the eDRAM: Microsoft's console more closely holds to 30fps when sailing in foggy weather conditions, whereas we see frame-rates hit much harder on the PS3, with frame-rates often falling into the mid-twenties. That said, detailed environments cause both consoles to dip below 30fps in a similar fashion, and sometimes we see the PS3 gain the advantage when bandwidth isn't a limiting factor - although, this doesn't happen too often. "

And just maybe devs didn't have to put as much effort into the 360, so they never squeezed out everything it could have done. Just a thought.
Framerates are usually CPU-related. XBOX 360 has 3 PPE cores vs 1 PPE on PS3.

IIRC, Black Flag (which uses the same engine) takes advantage of dual-core CPUs, so it could be this.
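The core-count point can be illustrated generically: an engine that partitions its per-frame work across worker threads gets more out of three cores than one. A toy C job-split sketch using POSIX threads (hypothetical, not Ubisoft's actual engine code):

```c
#include <pthread.h>
#include <stddef.h>

#define NUM_WORKERS 3   /* e.g. the 360's three PPE cores */
#define JOBS 900

static int results[JOBS];

struct range { int begin, end; };

/* Each worker handles a contiguous slice of the frame's jobs. */
static void *worker(void *arg)
{
    struct range *r = arg;
    for (int i = r->begin; i < r->end; ++i)
        results[i] = i * 2;   /* stand-in for animation/AI/physics work */
    return NULL;
}

/* Fan the frame's jobs out to NUM_WORKERS threads, then wait for all. */
void run_frame(void)
{
    pthread_t threads[NUM_WORKERS];
    struct range ranges[NUM_WORKERS];
    int per = JOBS / NUM_WORKERS;

    for (int w = 0; w < NUM_WORKERS; ++w) {
        ranges[w].begin = w * per;
        ranges[w].end   = (w == NUM_WORKERS - 1) ? JOBS : (w + 1) * per;
        pthread_create(&threads[w], NULL, worker, &ranges[w]);
    }
    for (int w = 0; w < NUM_WORKERS; ++w)
        pthread_join(threads[w], NULL);
}
```

A game that only ever spawns one or two such workers leaves the 360's third core idle, which is the "takes advantage of dual-core CPUs" point above.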

Why would they not take care of the XBOX 360 version? AC traditionally had the 360 as a lead platform (differences were huge during the AC1/2 era, despite the 360 version having crushed blacks).

PS3 had the SPUs to accommodate extra effects like SSAO. Uncharted 2 also had SSAO running as a compute job since 2009. It only took Ubi 5 years to catch up with ND.

Far Cry 3 is also another game that made good use of the SPUs: https://n4g.com/news/1125962/the-spus-are-hungry-maximizing-spu-efficiency-on-far-cry-3
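The SPU model being credited here can be sketched generically: each SPE owns a tiny 256KB local store, so data is streamed through it in chunks via DMA, processed, and written back. A toy C version of the pattern, with memcpy standing in for the real DMA engine (function names and the chunk-size split are illustrative, not Sony's SDK):

```c
#include <string.h>
#include <stddef.h>

#define LOCAL_STORE_BYTES (256 * 1024)  /* SPE local store size */
/* Leave most of the local store for code, stack, and double buffers. */
#define CHUNK_FLOATS (LOCAL_STORE_BYTES / sizeof(float) / 4)

/* Process one chunk entirely inside the "local store" buffer. */
static void spu_job(float *local, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        local[i] *= 0.5f;   /* stand-in for SSAO/post-process math */
}

/* Stream main-memory data through the small local buffer in chunks,
 * the way an SPE program DMAs data in, works on it, and DMAs it out. */
void process_stream(float *main_mem, size_t count)
{
    static float local_store[CHUNK_FLOATS];
    for (size_t off = 0; off < count; off += CHUNK_FLOATS) {
        size_t n = count - off < CHUNK_FLOATS ? count - off : CHUNK_FLOATS;
        memcpy(local_store, main_mem + off, n * sizeof(float)); /* "DMA in"  */
        spu_job(local_store, n);
        memcpy(main_mem + off, local_store, n * sizeof(float)); /* "DMA out" */
    }
}
```

Real SPU code would double-buffer the transfers so the DMA of the next chunk overlaps the math on the current one; this sketch only shows the chunking.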

What's strange to me is that the PS4/XB1 remaster is locked at 30 fps and there's no option for unlocked framerates.

PPE is an in-order dual-issue PowerPC uarch, while Jaguar is an OoO dual-issue x86 uarch. Surely it could afford higher framerates, especially on PS4 Pro/XB1X.

AC3 via 360 BC goes up to 60 fps on XB1, so there's no excuse for remasters being locked at 30 fps.

Bethesda never learned how to do the ps3. Oblivion, fallout 3 and skyrim would all die slow deaths the larger your save file became. 50 hours in you were plodding along at 2-3 fps.

Oh bethesda you incompetent hacks you.
I wouldn't say their PS4 efforts are amazing either.

Skyrim (PS4 remaster) is still locked at 30 fps, while the PSVR version runs at 60 fps.

Fallout 76 is probably the worst offender of their dated engine.
 

jakinov

Member
I bought all 3 consoles near their launches, so I had them all for a very long time. The PS3 exclusively had some of my favorite games last gen; however, as a platform, Sony made a lot of stupid decisions. They came out with fewer features than the 360, which launched a year earlier, and couldn't deliver new features fast or well enough. The PS3 was great for a gamer who wanted to play specific games, but it wasn't the place I wanted to play my games. It's like Mac being your favorite operating system and needing to use Windows because the software you want is only available on Windows; you ultimately use Windows, but you don't want to.

Sony really screwed up when planning out their console; most of their hardware decisions ended up causing problems (and they kept stripping out hardware features as time went by).

  • Cell Processor: Hard to develop for; cost them a fair amount of dev support, resulting in timed or full exclusives going elsewhere. Performance issues. Screwed up backwards compatibility for the PS4. As a customer, it meant worse-performing multi-platform games and fewer games to play.
  • Split RAM: Made things harder for developers; devs explicitly complained about it for certain games and then, according to Cerny, explicitly asked him to unify it. It also led to performance issues according to some devs.
  • Blu-ray: Slow seek times led to mandatory installs, in contrast to the 360's optional installs. Some games initially took up to 30 minutes to install and up to 5GB of a 20-80GB drive (with 10-15GB already used by the OS). Developers eventually avoided mandatory installs by duplicating data across the Blu-ray disc to cut seek times, essentially wasting the extra space.
  • Rechargeable controllers are great, but they gave you a short USB cable (of course you could always buy a longer one) and didn't think to let you charge the controller while in standby mode like you can today with the PS4.
  • They also should have copied the 360's expansion and audio ports being on the bottom of the controller, like they ended up doing for the PS4. The keyboard attachment was awkward, and not bundling a headset resulted in a less vibrant online community.
Last generation was when game consoles got very powerful; they could run sophisticated operating systems, and it was a time of a lot of innovation around the Internet. Sony showed a lack of ambition, or at least readiness, with the PS3. The OS was bad and lacked a lot of online features, and even when they did implement features, it took them a very long time. Bad UX too.

  • In-Game/App Interface - Took almost 2 years before we got an in-game interface, and even then it was sluggish and most of its features, like changing settings, didn't work.
    • Filled with useless options you couldn't actually use, and it could have been a more streamlined interface (fewer button presses)
    • In-Game Music: Implemented late, wasn't universal, and got minimal support
    • Didn't work in some apps like the PlayStation Store, certain video apps, or Blu-ray playback.
  • Online Game Session Integration:
    • No Join via Friends List (Implemented ~3 (?) years after launch)
    • No Invite via Friends List (Implemented ~3 (?) years after launch)
    • Multiple PlayStation Network accounts (Implemented ~3 (?) years after launch; sometime after Resistance 2)
    • Limited Status Messages (Improved ~2 (?) years after launch; I believe right before LBP or LBP2)
  • Trophies- took almost 2 years before they implemented this feature
    • Trophy syncing was not done in the background; you had to sit at a loading screen to sync trophies, and it was very, very slow (the longest for me being around 40 minutes).
    • Adding it somewhat late resulted in spotty support in the beginning
    • Trophy compare was also super slow.
  • Multiple Concurrent Users/Profiles
    • The system only let one user log in, as opposed to the 360 and PS4, where each controller gets tied to a profile when turned on.
    • They eventually added APIs (?; I don't think they were available for Resistance: FoM) that let developers ask for a profile in their game. But it was extra work for the dev to implement instead of being native like on PS4/360.
    • It led to spotty save-file support for player 2+ (i.e. not getting progress on their own profile), and the second player couldn't unlock trophies in a lot of games. I believe LBP2 circumvented this by reading your save file and giving you trophies the next time you opened the game on your own profile.
  • PSN Profile System
    • Games didn't integrate it like on Xbox Live, where you can click "view profile" in the game; I assume there was no API for it, considering there was no in-game UI
    • No viewing friends of friends
    • Ugly gray rectangle in the middle of the screen for your profile, until it became an ugly colored rectangle in the middle of the screen.
    • Took years before they figured out how to sell people new avatars
    • Extremely slow to load
    • Couldn't change your username
  • Communication
    • No cross-game-chat
    • No party system
    • No voice messaging
    • The message template system was dumb; we didn't need a subject line, we didn't need to see "re:re:re:re:re" chains, and the subject line sometimes ended up getting used instead of the body too
  • UX/Misc.
    • No context-aware PS button like on Steam Big Picture, 360, and PS4, i.e. when a popup comes up, pressing the PS button takes you to the popup's context
    • No sounds on system notifications
    • There was no quick password for PSN
    • Having local profiles only loosely coupled with the PSN ID wasn't great UX, e.g. having two names and two avatars for your account.
    • Took them a few years to add an indicator that you got new messages, instead of you having to manually check your inbox
    • Needing to do a long install for everything from PSN. On 360, PS4, and XB1, installs are transparent to the user and don't visibly extend the time it takes to play. Installs from the PSN store have taken up to 60 minutes for me depending on the content, and during that time you have to cancel if you want to do anything else on the system.
    • Updates in general on the system were very slow.
    • PSN download speeds appeared to be slow (especially in the beginning).

      [image: Penny Arcade comic, 2009-04-03]
Sony did an amazing job with the PS4 though. Not only did they fix all the stuff I mentioned, they added cool new features Microsoft didn't even have, and they arguably have better UX than Xbox atm. Also, when Microsoft has a feature PlayStation doesn't this generation, Sony copies it very fast, and just as well or better. Now the PS4 is the console I want to play my games on (though I play mostly on PC nowadays).
 

Riven326

Banned
Bethesda never learned how to do the ps3. Oblivion, fallout 3 and skyrim would all die slow deaths the larger your save file became. 50 hours in you were plodding along at 2-3 fps.

Oh bethesda you incompetent hacks you.
It would be more accurate to say Bethesda never learned.
 

zcaa0g

Banned
I still have my PS3 hooked up with a number of physical and digital titles and use it frequently enough by today's standards, but in the end compared to the XBOX 360, it's definitely the AliasStation.
 

SonGoku

Member
Bethesda never learned how to do the ps3. Oblivion, fallout 3 and skyrim would all die slow deaths the larger your save file became. 50 hours in you were plodding along at 2-3 fps.

Oh bethesda you incompetent hacks you.
The Bethesda hacks can't even work the ps4 properly lol
And just maybe devs didn't have to put as much effort into the 360, so they never squeezed out everything it could have done. Just a thought.
3rd party devs went on record stating that PS3 optimizations benefited 360 code making it run better
Off the top of my head i can name dice but i know there were others with similar comments
 
Last edited:

LordOfChaos

Member
3rd party devs went on record stating that PS3 optimizations benefited 360 code making it run better
Off the top of my head i can name dice but i know there were others with similar comments

If you carefully hand-tune memory layouts and access patterns, like you had to in order to maximize an SPE for example, it definitely helps everything; it just helps nothing as much as something with nearly no branch prediction like an SPE. Same as how writing tiny micro-programs that fit in a 256KB local memory ends up helping a cache.

Oh, where was that web comic that one time about a developer spending 8 years tuning programs for 256kb local memory and then not knowing what to do with themselves after, lol?
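The point about SPE discipline carrying over can be made concrete: the 256KB local store forced hot data into small contiguous arrays, and that same structure-of-arrays shape is exactly what caches and prefetchers reward on any CPU. A hypothetical C sketch of the contrast (the struct names and fields are illustrative):

```c
#include <stddef.h>

/* "Array of structures": position data interleaved with rarely used
 * fields, so a pass over positions drags cold bytes through the cache. */
struct EntityAoS { float x, y, z; int id; char name[48]; };

/* "Structure of arrays": the hot fields packed contiguously, the layout
 * the SPEs' local store forced on developers, and the one caches like too. */
struct EntitiesSoA { float *x, *y, *z; size_t count; };

/* Move every entity by (vx, vy, vz) using the SoA layout. */
void integrate_soa(struct EntitiesSoA *e, float vx, float vy, float vz)
{
    for (size_t i = 0; i < e->count; ++i) {
        e->x[i] += vx;  /* contiguous reads/writes: trivially DMA-able    */
        e->y[i] += vy;  /* on an SPE, prefetch-friendly on the 360's CPU  */
        e->z[i] += vz;
    }
}
```

The SoA version does the same work as iterating over `EntityAoS` records, but touches only the bytes it needs, which is why the discipline paid off on both consoles.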
 
If you carefully hand-tune memory layouts and access patterns, like you had to in order to maximize an SPE for example, it definitely helps everything; it just helps nothing as much as something with nearly no branch prediction like an SPE. Same as how writing tiny micro-programs that fit in a 256KB local memory ends up helping a cache.

Oh, where was that web comic that one time about a developer spending 8 years tuning programs for 256kb local memory and then not knowing what to do with themselves after, lol?
I've seen many devs saying that Cell coding discipline helped them a lot with GPU compute.
 
I never understood how people forgave MS for the RROD. I believe the chance your Xbox 360 (before the Slim) would RROD was something like 92%. I got it; it burned through my motherboard, and they told me I could sell it for parts or pay for another motherboard, which cost like 2/3 the price of the console. And yes, I lived in a country where the extended 3-year warranty did not apply. It was just bonkers that MS got away with it by giving an extended warranty, after which some people's consoles would RROD for the third or fourth time.
 
I never understood how people forgave MS for the RROD. I believe the chance your Xbox 360 (before the Slim) would RROD was something like 92%. I got it; it burned through my motherboard, and they told me I could sell it for parts or pay for another motherboard, which cost like 2/3 the price of the console. And yes, I lived in a country where the extended 3-year warranty did not apply. It was just bonkers that MS got away with it by giving an extended warranty, after which some people's consoles would RROD for the third or fourth time.
It was more like 33-50 percent but the ps3 ylod was a big problem too.

All launch 360 and ps3s are doomed eventually.

People forgave 360 because the games were there and it was a graphical monster for 2005
 