
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Skeff

Member
1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.

2) Suggesting the Digital Foundry interview as "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?

3) When did NeoGaf = the rest of the world?

You're still not addressing the points raised.

I suggest you first look at Brad Grenz's post regarding the difference in the necessary use cases, and then my post regarding the ways in which developer interaction with the ESRAM is completely different from the EDRAM.
 
1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.

2) Suggesting the Digital Foundry interview as "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?

3) When did NeoGaf = the rest of the world?

If people on Gaf believe what Sony is saying is bullshit, then they will label it as such. You'd best believe that.
 

vcc

Member
As a former biology/psychology double-major does that now give me the right to have my interpretation of what "evolution" means?

No. Psychology? That's not even a real science. I kid. I finished off with a specialization in computer science and, technically, a minor in genetics.

I'm not talking about the evolution of species here, I'm talking the evolution of technology, which in many ways is a different interpretation. Evolution (as you know as a former genetics major) relative to species has more to do with adaptation, survival, and the ability to pass those genes to future generations. Evolution in technology also has similar principles, but is mainly used loosely as another way to describe the improvements of tech from past to present.

I was being pedantic about evolution. But MS did lay out the 'environment' in which the XB1 would evolve, and their criteria were cost and market positioning rather than game production. So while their ESRAM solution is an evolution of the EDRAM one, it's based on the selective pressure of Kinect, apps, Snap, and the Windows 8 Marketplace rather than on making better games.

And "relativity" was not originally the argument some here were making when they attempted to pick apart my initial statements (the adjustment to a "relativity" argument against my statements by some has just started to surface) . If someone wanted to say that the PS4's memory architecture is easier to grasp, that's all they have to say, "the PS4's memory architecture is easier to develop for." The problem is people start suggesting "the Xbone memory architecture is a bitch to develop for" (which was originally mentioned to me by Sword of Doom who said "a developer" said it in the EDGE article). Look, figuring out the answer to 2+2 compared to 2x12 is definitely easier (and 2x12 is more "difficult" to figure out compared to 2+2), but in the grand scheme of things they're both pretty easy to figure out.

There are ways in which the ESRAM setup in the XB1 is harder than the EDRAM setup in the 360. It's not the same setup again. Conceptually it's similar, but now it has to be used optimally, whereas before any use was a free bonus: for the 360 any use was gravy, while for the XB1 it must be used optimally. That's the other aspect of our arguments, on top of the implementation-detail complications we alluded to.
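As a rough illustration of that "must be used optimally" point, here is a hypothetical back-of-the-envelope sketch (the 32 MB / 10 MB capacities are the publicly stated ones; the render-target formats and sizes are my own assumptions, not from any actual SDK):

```python
# Rough render-target budgeting sketch: why a 32 MB scratchpad forces choices.
# Formats and target lists below are illustrative assumptions only.

MiB = 1024 * 1024
ESRAM_BYTES = 32 * MiB   # Xbox One ESRAM capacity
EDRAM_BYTES = 10 * MiB   # Xbox 360 EDRAM capacity

def target_bytes(width, height, bytes_per_pixel):
    """Uncompressed size of one full-screen render target."""
    return width * height * bytes_per_pixel

# A plausible 1080p deferred-rendering G-buffer: several full-screen targets.
gbuffer_1080p = [
    target_bytes(1920, 1080, 4),  # albedo, 32-bit
    target_bytes(1920, 1080, 4),  # normals, 32-bit
    target_bytes(1920, 1080, 4),  # depth/stencil, 32-bit
    target_bytes(1920, 1080, 8),  # HDR light accumulation, 64-bit
]

total = sum(gbuffer_1080p)
print(f"1080p G-buffer: {total / MiB:.1f} MiB vs {ESRAM_BYTES / MiB:.0f} MiB ESRAM")
print("fits in ESRAM:", total <= ESRAM_BYTES)
# The whole set does not fit, so the developer must pick which targets live in
# the fast pool and shuttle the rest through slower main RAM -- that is the
# "used optimally" burden. On the 360, anything that fit in EDRAM was a free win.
```

The arithmetic is the whole argument: a single 32-bit 1080p target is already nearly 8 MiB, so a realistic set of them overflows the fast pool and placement becomes a design decision rather than a freebie.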
 

shem935

Banned
1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.

2) Suggesting the Digital Foundry interview as "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?

3) When did NeoGaf = the rest of the world?

1. It doesn't, but when you are arguing with a developer and several other veteran members who are very knowledgeable on the subject, you need more than general claims.

2. It is fine and dandy to put stock in PR articles, but when the PR department putting out the statement has been caught in lies and misinformation before, why would you believe them?

3. Cause that is totally what he meant.
 
No. Psychology? that's not even a real science. I kid. I finished off with a specialization in computer science and technically a minor in genetics.



I was being pedantic about evolution. But MS did lay out the 'environment' in which the XB1 would evolve, and their criteria were cost and market positioning rather than game production. So while their ESRAM solution is an evolution of the EDRAM one, it's based on the selective pressure of Kinect, apps, Snap, and the Windows 8 Marketplace rather than on making better games.



There are ways in which the ESRAM setup in the XB1 is harder than the EDRAM setup in the 360. It's not the same setup again. Conceptually it's similar, but now it has to be used optimally, whereas before any use was a free bonus: for the 360 any use was gravy, while for the XB1 it must be used optimally. That's the other aspect of our arguments, on top of the implementation-detail complications we alluded to.

Exactly
 

viveks86

Member
1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.

2) Suggesting the Digital Foundry interview as "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?

3) When did NeoGaf = the rest of the world?

Since the others haven't acknowledged all 3 of your points yet, let me be the first.

1) Yes, people should not have brought up your "Junior" status. That is not cool. I'd expect more from Gaf. Having said that, does it look like people are ignoring you? It seems like the other way around.

2) People are not suggesting that the DF article is MS PR just because it is a DF article about the Xbox. They are suggesting it is MS PR because the claims made in there don't really make much sense. If there is an article on a similar website about Sony that doesn't make any sense either, Gaf will call it out.

3) NeoGaf is not equal to the rest of the world. But it represents a diverse population with extremely varied opinions that cannot be equated to a hive mind (not implying that you are saying that). Gaf is, at the minimum, representative of the world of gaming enthusiasts, who are really the ones interested in having this conversation anyway.

Now that you have seen a complete response to your points, how about responding to several valid replies that you have received?
 

LilZippa

Member
As a former biology/psychology double-major does that now give me the right to have my interpretation of what "evolution" means? I'm not talking about the evolution of species here, I'm talking the evolution of technology, which in many ways is a different interpretation. Evolution (as you know as a former genetics major) relative to species has more to do with adaptation, survival, and the ability to pass those genes to future generations. Evolution in technology also has similar principles, but is mainly used loosely as another way to describe the improvements of tech from past to present.

So let's say tech A improves on the old process while tech B jumps to a new method. After multiple groups check out tech A, they say it seems like an improvement over the previous tech A, but tech B seems far more advanced, with no foreseeable disadvantages.

Which one has the advantage, based on nothing other than the tech?

My point being guided evolution of tech relies on the vision of the company doing the guiding.
 
He makes some fantastic points, but I feel game content below 1080p scales very, very well on my 1080p television. I think it really comes down to the quality of the scaler in a person's television, or the quality of the scaler in the system itself.

My HDTV is native 1080p, but games like Halo 4, Uncharted 2, GTA V, and a multitude of others all look quite amazing on it. Pretty much the only issue I ever run into is not being able to see the text on the user interface as well as I'd like. My go-to examples currently are reading text messages on the cell phone in GTA V, or trying to get a good look at where the cops are on the mini-map when trying to escape from them. So my main argument is: if you have a native 1080p television right now and don't have any major complaints about 720p games on the 360 or PS3, why would scaling issues that are already non-existent for you be exacerbated by the more powerful Xbox One, which will produce much superior graphics quality at even higher resolutions than the 360 or PS3?



Rather than telling me to stop, just accept that you and I have very different views. Although, in fairness, I do have the 360 version of Red Dead Redemption. I don't really know what the PS3 version of that game is like, but a major flaw in how we sometimes view things, I think, is that we readily accept that, due to the architectural similarities between the two systems this time around, the performance gap between them is even more telling or significant than was the case for whatever gap existed between the 360 and PS3. That we all accept.


However, at the very same time, we don't seem to readily acknowledge the fact that the Xbox One architecture is likely nowhere near as difficult for programmers to extract performance from as the PS3 was, which means there's a very high probability we also won't be seeing any Bayonetta-style porting disasters on the Xbox One, which may significantly blunt the "wow, this game is so unplayable on the Xbox One" complaints, like you see some people make about PS3 versions of certain multi-platforms. There is no Cell processor in the Xbox One. ESRAM may be a challenge, but it isn't PS3-architecture levels of challenging, at least certainly not from my understanding. So, really, the PS4 versions of multi-platforms will likely be superior, but just don't be surprised if the Xbox One version isn't absolute "shit" in terms of graphics quality and framerate. Many of the things gamers had to deal with on the current-gen systems won't apply this time around. Solid to great AF with some manner of AA seems a given, along with high-quality textures and shaders. I mean, we'll have to see, but I get the feeling the "ugly duckling" version this upcoming gen won't be quite so ugly compared to what we might have seen on the 360 and PS3.

I'm sorry I kind of flew off the rails when I read your post. But I told you to stop because you don't even own a PS3, yet you make a comment saying PS3 owners are completely OK with worse-looking/performing games. And that is absolutely false.
 
As a former biology/psychology double-major does that now give me the right to have my interpretation of what "evolution" means? I'm not talking about the evolution of species here, I'm talking the evolution of technology, which in many ways is a different interpretation. Evolution (as you know as a former genetics major) relative to species has more to do with adaptation, survival, and the ability to pass those genes to future generations. Evolution in technology also has similar principles, but is mainly used loosely as another way to describe the improvements of tech from past to present.

And "relativity" was not originally the argument some here were making when they attempted to pick apart my initial statements (the adjustment to a "relativity" argument against my statements by some has just started to surface) . If someone wanted to say that the PS4's memory architecture is easier to grasp, that's all they have to say, "the PS4's memory architecture is easier to develop for." The problem is people start suggesting "the Xbone memory architecture is a bitch to develop for" (which was originally mentioned to me by Sword of Doom who said "a developer" said it in the EDGE article). Look, figuring out the answer to 2+2 compared to 2x12 is definitely easier (and 2x12 is more "difficult" to figure out compared to 2+2), but in the grand scheme of things they're both pretty easy to figure out.

Please read the OP; the Edge article is linked there and you can read it yourself. That is what that developer said. And the relative argument is what people said from the beginning. I was one of the first people who replied to your post many pages ago. I don't even know what the point of all of this is anymore. I am not a tech person, and for me it's simple: the XB1 is harder to develop for relative to the PS4. The posters who are a lot more knowledgeable have gone into great detail explaining to you exactly why they believe the eSRAM is more difficult to develop for. Yet you offer nothing technical or rational except for referencing the DF article. If you are absolutely convinced that the DF article is true, then why bother arguing about it?
 

BigJoeGrizzly

Neo Member
Since the others haven't acknowledged all your 3 points yet, let me be the first.

1) Yes, people should not have brought up your "Junior" status. That is not cool. I'd expect more from Gaf.

2) People are not suggesting that the DF article is MS PR just because it is a DF article about the Xbox. They are suggesting it is MS PR because the claims made in there don't really make much sense. If there is an article on a similar website about Sony that doesn't make any sense either, Gaf will call it out.

3) NeoGaf is not equal to the rest of the world. But it represents a diverse population with extremely varied opinions that cannot be equated to a hive mind (not suggesting that you are saying that). Gaf is, at the minimum, representative of the world of gaming enthusiasts, who are really the ones interested in having this conversation anyway.

Now that you have seen a complete response to your points, how about responding to several valid replies that you have received?

I've been constantly responding to several different replies for a bit now. There are initial quirks to any type of hardware or software that must be figured out. I have not once suggested that there isn't a learning curve for developers of BOTH platforms (and ALL platforms actually). There have been two points I have made without wavering:

1) Will the majority of gamers over time really care THAT much about hardware superiority when they are experiencing awesome looking, fun games from both platforms?

2) Taking the word of an MS software engineer who has worked on the 360 and now the Bone, I don't see how the Bone's current memory architecture could be considered a "bitch" or "difficult" to develop for. Yes, Sony has the better memory setup, and it will be "easier" for developers to grasp, but something being easier than another thing doesn't make that other thing "difficult."

Simple as that. I've read everyone's explanations, and the theme of those explanations is that there is a learning curve to unleash the full capabilities of the Xbone's memory setup. That's something that ANYONE who knows a thing about software and hardware development can tell you. There's going to be a learning curve for the PS4 memory tech as well, just not as steep. I knew presenting these arguments would cause a stir, but I'd rather do that than take the easy alternative of simply saying "The PS4 is gonna shit all over the Xbone, DEAL WITH IT."
 

Skeff

Member
I've been constantly responding to several different replies for a bit now. There are initial quirks to any type of hardware or software that must be figured out. I have not once suggested that there isn't a learning curve for developers of BOTH platforms (and ALL platforms actually). There have been two points I have made without wavering:

1) Will the majority of gamers over time really care THAT much about hardware superiority when they are experiencing awesome looking, fun games from both platforms?

2) Taking the word of an MS software engineer who has worked on the 360 and now the Bone, I don't see how the Bone's current memory architecture could be considered a "bitch" or "difficult" to develop for. Yes, Sony has the better memory setup, and it will be "easier" for developers to grasp, but something being easier than another thing doesn't make that other thing "difficult."

Simple as that. I've read everyone's explanations, and the theme of those explanations is that there is a learning curve to unleash the full capabilities of the Xbone's memory setup. That's something that ANYONE who knows a thing about software and hardware development can tell you. There's going to be a learning curve for the PS4 memory tech as well, just not as steep. I knew presenting these arguments would cause a stir, but I'd rather do that than take the easy alternative of simply saying "The PS4 is gonna shit all over the Xbone, DEAL WITH IT."

A lot of words but no actual rebuttal. I have explained how the ESRAM and EDRAM are clearly different using the exact quote you used, so please respond legitimately or GTFO.
 

Famassu

Member
No need to bring the PS4 into this specific debate we're having. I know the PS4 memory architecture is better and more efficient. I'm strictly talking about the Xbone's memory here, and how it compares to the previous generation. I want to know how something that is considered an evolution, and more capable than its previous iteration, makes it automatically more difficult, and where people are all of a sudden getting the idea that the 360's memory architecture was difficult to develop for. I feel like some people here are just grasping at straws, just to make the Xbone's memory architecture seem THAT much more inferior to the PS4's method.
It can be a hindrance if it's not a GOOD ENOUGH evolution in regards to the overall needs of the hardware and of the software that is going to be developed for the platform. I don't see what's so hard to understand. For the Xbox 360, the amount & speed of eDRAM was a benefit, and it didn't add too much complexity (as in, it was easy enough to use). For the Xbone, the eSRAM is a hindrance because there's not enough of it, plus there's the added negative that it makes development more complex than on its competitors (or than the Xbox 360's eDRAM did). The Xbone's eSRAM is more of a bottleneck (there to counter an even bigger bottleneck), whereas the Xbox 360's eDRAM was a benefit.

I mean, the PS3 could be argued to have been more powerful than the Xbox 360 this gen, but due to the complexity of its hardware you really can't see that in the multiplatforms (in fact, it's the opposite: PS3 versions suck, in most cases). This time the Xbone is not only a much weaker console than the PS4 (a much bigger gap than PS3 vs. Xbox 360), it's also the harder one to develop for, meaning that unless devs really put some effort into their ports, the gap might seem even wider than the raw power difference would imply. If devs put out ports of multiplatform games as half-assed for the Xbone as they did for the PS3, then it won't be a pretty sight.
 
I've been constantly responding to several different replies for a bit now. There are initial quirks to any type of hardware or software that must be figured out. I have not once suggested that there isn't a learning curve for developers of BOTH platforms (and ALL platforms actually). There have been two points I have made without wavering:

1) Will the majority of gamers over time really care THAT much about hardware superiority when they are experiencing awesome looking, fun games from both platforms?

2) Taking the word of an MS software engineer who has worked on the 360 and now the Bone, I don't see how the Bone's current memory architecture could be considered a "bitch" or "difficult" to develop for. Yes, Sony has the better memory setup, and it will be "easier" for developers to grasp, but something being easier than another thing doesn't make that other thing "difficult."

Simple as that. I've read everyone's explanations, and the theme of those explanations is that there is a learning curve to unleash the full capabilities of the Xbone's memory setup. That's something that ANYONE who knows a thing about software and hardware development can tell you. There's going to be a learning curve for the PS4 memory tech as well, just not as steep. I knew presenting these arguments would cause a stir, but I'd rather do that than take the easy alternative of simply saying "The PS4 is gonna shit all over the Xbone, DEAL WITH IT."

I misquoted the developer from the Edge article. He actually says, “Xbox One is weaker and it’s a pain to use its ESRAM.” Here is the link to the article from the OP:
http://www.edge-online.com/news/pow...erences-between-ps4-and-xbox-one-performance/

A lot of words but no actual rebuttal. I have explained how the ESRAM and EDRAM are clearly different using the exact quote you used, so please respond legitimately or GTFO.

Don't bother anymore man. To think you spent all this time trying to explain and he's not even listening
 
I've been constantly responding to several different replies for a bit now. There are initial quirks to any type of hardware or software that must be figured out. I have not once suggested that there isn't a learning curve for developers of BOTH platforms (and ALL platforms actually). There have been two points I have made without wavering:

1) Will the majority of gamers over time really care THAT much about hardware superiority when they are experiencing awesome looking, fun games from both platforms?

2) Taking the word of an MS software engineer who has worked on the 360 and now the Bone, I don't see how the Bone's current memory architecture could be considered a "bitch" or "difficult" to develop for. Yes, Sony has the better memory setup, and it will be "easier" for developers to grasp, but something being easier than another thing doesn't make that other thing "difficult."

Simple as that. I've read everyone's explanations, and the theme of those explanations is that there is a learning curve to unleash the full capabilities of the Xbone's memory setup. That's something that ANYONE who knows a thing about software and hardware development can tell you. There's going to be a learning curve for the PS4 memory tech as well, just not as steep. I knew presenting these arguments would cause a stir, but I'd rather do that than take the easy alternative of simply saying "The PS4 is gonna shit all over the Xbone, DEAL WITH IT."

Eh, personally I actually think there is a high likelihood MS will improve the SDK, specifically how the ESRAM processes work, and that should make programming for it much easier.

I don't think it's as difficult as the Cell was to program for, but nor do I think there is any real hidden potential to be unlocked with it.

However, the statement by Goosen about the ESRAM in no way comments on the complexity of programming for the ESRAM, and if anything it would suggest that a higher level of programming complexity is required.

We have had numerous articles detailing the ease of porting onto the PS4, with nothing positive said about developing on the XB1. The only parts of the architecture that in my opinion could be causing these hiccups for developers are the SDK as it stands, the ESRAM, or both. MS will improve their SDK, I'm sure, but if they don't adequately automate the ESRAM processes, it may always require far more finesse.
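To make the "automate the ESRAM processes" idea concrete, here's a hypothetical sketch of what such SDK automation might look like; the greedy allocator, target names, sizes, and bandwidth scores are all invented for illustration and come from no actual SDK:

```python
# Sketch of automated fast-memory placement: greedily pack the most
# bandwidth-hungry render targets into the 32 MB scratchpad and spill
# the rest to main RAM. Hypothetical heuristic, not a real toolchain.

MiB = 1024 * 1024
ESRAM_BYTES = 32 * MiB

def auto_place(targets):
    """targets: list of (name, size_bytes, bandwidth_score).
    Returns a dict mapping each target name to 'esram' or 'ddr3'."""
    placement, free = {}, ESRAM_BYTES
    # Highest bandwidth score first: those targets gain the most from ESRAM.
    for name, size, _score in sorted(targets, key=lambda t: -t[2]):
        if size <= free:
            placement[name] = "esram"
            free -= size
        else:
            placement[name] = "ddr3"
    return placement

# Invented example workload: four render targets that won't all fit.
targets = [
    ("depth",   8 * MiB, 10),  # read/written every pass
    ("light",  16 * MiB,  9),  # HDR accumulation, heavy blending
    ("albedo",  8 * MiB,  5),
    ("shadow",  8 * MiB,  4),
]
print(auto_place(targets))  # 'shadow' spills: 8+16+8 MiB already fill the pool
```

The point of the sketch is only that placement decisions like this can be made by a tool instead of by hand, which is exactly the kind of SDK improvement being speculated about; a real allocator would also have to handle aliasing, lifetimes, and per-pass scheduling.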
 

vcc

Member
Simple as that. I've read everyone's explanations, and the theme of those explanations is that there is a learning curve to unleash the full capabilities of the Xbone's memory setup. That's something that ANYONE who knows a thing about software and hardware development can tell you. There's going to be a learning curve for the PS4 memory tech as well, just not as steep. I knew presenting these arguments would cause a stir, but I'd rather do that than take the easy alternative of simply saying "The PS4 is gonna shit all over the Xbone, DEAL WITH IT."

I'd agree, and say that despite everything the XB1 will still stress programmers out less than the PS2 or PS3 did. I recall a chat with some BioWare people: MDK on the PS2 was such a painful ordeal that it turned them off Sony development for the entire generation. PS3 pioneer programmers likely had it as bad or worse.

I think Xbox is in a very unfortunate position, and they'll need to dump the hubris and start taking their competition seriously before it's too late and we mourn the Xbox One like we mourn the Sega Saturn.
 
Firstly, you just highlighted some issues that probably will not go away with upscaling xbox one games. Is it not reasonable for people to want those issues (however minor you think they are) go away with a new generation?

Secondly, you have assumed there are no major complaints with upscaling 720p on PS3 and 360. They might not be major complaints for you and I. But high-end PC gamers who are used to impeccable image quality will consider it a major complaint when they play console exclusives. I'm sure you would agree that image quality is quite crappy for many of the top-end games that push the boundaries of the hardware. So for people who care about image quality, don't you think those are valid complaints? Sure you can anti-alias the heck out of it with additional horsepower on next gen, but as SPE pointed out, they would still not be as good as native resolution.

I think 3 display planes, each capable of running at its own independent resolution and even framerate, mean we can most likely kiss that problem goodbye on the Xbox One. In other words, even if a game is rendering at native 900p, the UI is more than likely still running at native 1080p. That's one of the primary reasons the display planes exist on the Xbox One: to solve this problem. I'd be willing to go out on a limb and bet that the UI in Ryse is native 1080p, even though the game itself is rendering at native 900p.
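The display-plane idea can be sketched as a toy model (purely illustrative Python; the plane names and resolutions are assumptions based on the publicly described three-plane scan-out, not any real console API):

```python
# Toy model of hardware display planes: each plane renders at its own
# resolution, and the scan-out compositor scales and layers them per frame.
# Illustrative only; no actual hardware or SDK behaves exactly like this.

OUTPUT = (1920, 1080)  # final output resolution

class Plane:
    def __init__(self, name, resolution):
        self.name = name
        self.resolution = resolution  # (width, height) the content renders at

    def scale_factor(self):
        """Per-axis scale applied at scan-out to reach the output resolution."""
        return (OUTPUT[0] / self.resolution[0], OUTPUT[1] / self.resolution[1])

# Game drops to 900p under load; UI and system overlay stay pixel-perfect.
planes = [
    Plane("game",   (1600, 900)),
    Plane("ui",     (1920, 1080)),
    Plane("system", (1920, 1080)),
]

for p in planes:
    sx, sy = p.scale_factor()
    kind = "scaled" if (sx, sy) != (1.0, 1.0) else "native"
    print(f"{p.name}: {p.resolution[0]}x{p.resolution[1]} -> {kind} ({sx:.2f}x)")
```

Only the game plane gets scaled (900p to 1080p is a 1.2x stretch per axis); the UI and system planes pass through untouched, which is why HUD text and mini-maps could stay crisp even when the 3D scene renders below native resolution.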

I also agree. It is absolutely reasonable for people to want other issues, however minor they may be to me, to be dealt with in a new generation, and I honestly believe they will be by the power of these new systems, even the Xbox One. I may be in small company here, but the best-looking games on both systems are games I personally have zero complaints about as far as image quality goes. Games like Halo 4, Uncharted 2 and 3, God of War 3, GTA V, the Batman Arkham games, Heavy Rain, etc., are all games I think look really damn good. Alan Wake is one of the lowest-resolution exclusives on either system, and I thought that game looked fantastic. I believe it had 4xMSAA as well, which no doubt helped a great deal, but the lighting, the environments, the atmosphere, the impressive use of particles: it's an amazing-looking game. Maybe I'm just not as picky or discerning about these things as others. Even so, surely there are enough gamers on here with pretty solid, or even excellent, 1080p HDTVs who agree how amazing these games look? Don't get me wrong, I've seen some HDTVs that look just abysmal at anything less than 1080p, and in such circumstances native 1080p may indeed be absolutely crucial for those folks, and I guess I do have to be pretty mindful of that. Westinghouse models are some big culprits from what I've seen. And even an older Toshiba Regza model I had, which was quite good, gets roundly spanked by my newer Samsung with regard to quality scaling at different resolutions.

Sometimes it really does boil down to the quality of the television or scaler on the television itself, and, really, that can be a hit or miss thing, especially if you're on a budget, and need to pick out the most affordable thing possible at the time, but I've always been careful to be really patient and just save up for as long as was necessary for just the right HDTV for my needs. I bought ones I understood to be of lesser quality for family or even friends for when a specific level of quality wasn't really the big focus, but I've always done my research to make sure what I got for myself was nothing short of the best, or was at least the best when it was new.

And so maybe there are bigger factors for why I don't have complaints about these titles, or why I'm less concerned about games not being 1080p native on the new systems. Only one I have a big enough complaint about is the UI elements such as cell phone text or the mini-map for spotting cops in GTA V, but that's one of the primary issues I expect to see addressed on the Xbox One, and I don't expect it will be an issue on the PS4 either, since most things will be native 1080p or close to it anyway.

I'm sorry I kind of flew off the rails when I read your post. But I told you to stop because you don't even own a PS3, yet you make a comment saying PS3 owners are completely OK with worse-looking/performing games. And that is absolutely false.

Whoa, hold the phone, junior. I absolutely own a PS3. Just because I don't own the ps3 versions of specific games doesn't mean I don't own the system. I quite regularly talk about the PS3 games that I own and enjoy the most on this forum. I'm one of the people that bought one for $600 at launch.
 

viveks86

Member
I've been constantly responding to several different replies for a bit now. There are initial quirks to any type of hardware or software that must be figured out. I have not once suggested that there isn't a learning curve for developers of BOTH platforms (and ALL platforms actually). There have been two points I have made without wavering:

1) Will the majority of gamers over time really care THAT much about hardware superiority when they are experiencing awesome looking, fun games from both platforms?

2) Taking the word of an MS software engineer who has worked on the 360 and now the Bone, I don't see how the Bone's current memory architecture could be considered a "bitch" or "difficult" to develop for. Yes, Sony has the better memory setup, and it will be "easier" for developers to grasp, but something being easier than another thing doesn't make that other thing "difficult."

Simple as that. I've read everyone's explanations, and the theme of those explanations is that there is a learning curve to unleash the full capabilities of the Xbone's memory setup. That's something that ANYONE who knows a thing about software and hardware development can tell you. There's going to be a learning curve for the PS4 memory tech as well, just not as steep. I knew presenting these arguments would cause a stir, but I'd rather do that than take the easy alternative of simply saying "The PS4 is gonna shit all over the Xbone, DEAL WITH IT."

Ok. Again, I would really recommend you look at Brad Grenz's post and then respond to people like Skeff who have specifically addressed the points you have made. If you disagree, then present your case. If you agree, then acknowledge it. It's the right thing to do.

Now going back to your post:

1) As much as gamers like their games, they are consumers as well, and they expect value for the money they spend. If there is a sense that people are getting ripped off for the price they pay, then there will be frustration. That doesn't mean all of them will stop enjoying games, but it will tarnish the brand that MS is trying so hard to build. This isn't a one-lap race. The gamer who bought the Xbox One will not buy the next one (Xbox Two?) if he/she realizes over time that MS isn't giving him/her value comparable to the direct competition.

2) We are talking about people here who are claiming the ESRAM is, in fact, difficult to develop for (not just in relative terms), and they have presented their case explaining why. What response do you have for them? That it's not difficult? If so, how would you know?

And finally, good that you are aware of what kind of responses you would get. I tried to play the devil's advocate just yesterday and got some similar reactions. But I also got some very insightful comments. I'd suggest you address or acknowledge them directly.
 

BigJoeGrizzly

Neo Member
A lot of words but no actual rebuttal, I have explained how the esram and edram are clearly different using the exact quote you used, so please respond legitimately or GTFO.

There are differences between eSRAM and eDRAM, but there are also similarities (which were noted in the DF interview). Because of those differences, there will be a learning curve, but that doesn't mean it's a "bitch" either. I think one quote from "a developer" in ONE article should not carry so much weight that it is considered ultimate truth (as seems to be the case for many of you). I need more proof than a single unnamed developer source.
 
I'm not so sure, the 360 cleanly beat the PS3 in attach rate partly because for the majority of its lifespan the 360 had the better version of important multi-platform games (GTA4, COD, BF3, etc...).

People like the notion of value, and $100 cheaper for better performance is a major selling point, and the idea hit the mainstream. It was a simple, easy-to-explain idea, and it was news. The news loves that sort of narrative: an industry giant brought low by its hubris. It's the narrative they wrote about Atari, Nintendo, and Sony. Now it's Microsoft's turn, and the narrative will inform purchase decisions.

We'll see how it all turns out, but MS needs to cut out the 'we're not worried' position and start taking their adversaries more seriously. If they bow out of the industry, it isn't a good thing for the consumer.

I honestly feel the amazing attach rate advantage the 360 had was more due to being the more affordable system, and having a year-long head start on the competition. Microsoft picked up all or most of the more rabid, hardcore gamers who honestly couldn't wait to get started on the next generation of games, and that benefitted the 360 all generation long from the looks of it. I can't speak much from the perspective of PS3 multi-platform games, because most of my multi-platform games were purchased on the Xbox 360, with the exception of FF13. The PS3 was pretty much my secondary system, mostly for the big exclusives I couldn't get on the 360. The 360 was the primary system where I bought all the multi-platforms, but that was for reasons far beyond better multi-platform versions. All or most of my friends were gaming on 360, and the 360 was my favorite system personally, so I wanted to own all the biggest games specifically for that console. It was a hard habit to kick. Buying a multi-platform on the PS3 felt like completely starting fresh and abandoning the community of people I interacted with over on the 360 side of things.

That's why I got most multi-platforms on the 360, and why I continue to do so. I read that GTA V was better on the PS3, and I still bought it for the 360. Granted, I had long since pre-ordered the 360 version and wasn't going to cancel the order, but hearing that the PS3 version was better, and didn't have any of the issues the 360 version had with installations and such, still did nothing to dissuade me from the 360 version for the reasons I listed above. Sony's price more or less guarantees they'll stomp MS in the attach rate category this time around.
 

otapnam

Member
I honestly feel the amazing attach rate advantage the 360 had was more due to being the more affordable system, and having a year-long head start on the competition. Microsoft picked up all or most of the more rabid, hardcore gamers who honestly couldn't wait to get started on the next generation of games, and that benefitted the 360 all generation long from the looks of it.

Halo
Cod
Madden
Live
 
Halo
Cod
Madden
Live

Well, Live was apparently a factor also for a lot of folks. I personally was never a Gold subscriber, only ever taking advantage of free trials. The Xbox One will be the first time I commit to the service in any serious way. If Sony had launched the PS3 a year in advance, and at a better price, the 360 would've been dead in the water, probably wouldn't have seen such strong developer support, particularly in Japan, and Sony would have had the superior attach rate. I genuinely believe this.
 

otapnam

Member
Well, Live was apparently a factor also for a lot of folks. I personally was never a Gold subscriber, only ever taking advantage of free trials. The Xbox One will be the first time I commit to the service in any serious way. If Sony had launched the PS3 a year in advance, and at a better price, the 360 would've been dead in the water, probably wouldn't have seen such strong developer support, particularly in Japan, and Sony would have had the superior attach rate. I genuinely believe this.

Probably should add the coming-out-a-year-earlier and easier-development part.

I think what a lot of people are really interested in this time is the whole process: how the companies designed their systems, marketing, engines, etc. That's what we're all reading about in this thread anyways. And the payoff is going to come in less than two months when we get the final launch product.
 
Probably should add the coming-out-a-year-earlier and easier-development part.

I think what a lot of people are really interested in this time is the whole process: how the companies designed their systems, marketing, engines, etc. That's what we're all reading about in this thread anyways. And the payoff is going to come in less than two months when we get the final launch product.

better controller, Xbox Live, cross party chat.....
 

mrklaw

MrArseFace
Can we just imagine Xbox One has roughly the same effective bandwidth - if programmed carefully - as PS4? Then we can move the discussion on to the more concrete areas like 50% more CUs, 100% more ROPs etc. I'd be interested to know how, assuming similar effective bandwidth, each system would be able to utilise and keep fed those other elements.
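For what it's worth, the CU/ROP gap can be put into rough numbers. This is a back-of-the-envelope sketch, assuming the widely reported pre-launch specs (Xbox One: 12 CUs / 16 ROPs at 853MHz; PS4: 18 CUs / 32 ROPs at 800MHz) and standard GCN figures (64 ALUs per CU, 2 FLOPs per ALU per cycle, 1 pixel per ROP per cycle):

```python
# Rough theoretical compute and fill rate for the two GPUs.
# Clocks and unit counts are the widely reported pre-launch specs,
# treated here as assumptions rather than official numbers.

def gcn_tflops(cus, clock_mhz):
    # Each GCN compute unit has 64 shader ALUs, each doing
    # 2 FLOPs per cycle (fused multiply-add).
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

def fill_rate_gpix(rops, clock_mhz):
    # Each ROP can write one pixel per cycle.
    return rops * clock_mhz * 1e6 / 1e9

xbone = (gcn_tflops(12, 853), fill_rate_gpix(16, 853))
ps4 = (gcn_tflops(18, 800), fill_rate_gpix(32, 800))

print(f"Xbox One: {xbone[0]:.2f} TFLOPS, {xbone[1]:.1f} Gpix/s")
print(f"PS4:      {ps4[0]:.2f} TFLOPS, {ps4[1]:.1f} Gpix/s")
```

If those assumptions hold, that's roughly a 40% compute advantage and nearly double the raw fill rate for PS4, independent of the bandwidth question.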
 

mrklaw

MrArseFace
I wonder how different it might have been if MS had gone for eDRAM on a daughter die? They'd have been able to put more on it (64-128MB), have had faster bandwidth, and freed up space on the APU for more CUs. Could have been on par with PS4. But they went with eSRAM I guess mainly for cost reduction possibilities later in the generation, which could bite them on the ass.
 
Even so, surely there are enough gamers on here with pretty solid, or even excellent, 1080p HDTVs who agree how amazing these games look?
The thing is that "looking amazing" is a moving target. Uncharted looked amazing...but not so much when compared to Uncharted 2. My old 42" 1024x768 TV looked amazing...but not so much when I upgraded to a 60" 1080p one.

Contrary to what many people say, wanting better hardware to enable better graphics isn't some tangential pursuit. It is inextricably part of the desire for better games. There will be good-looking games everywhere; there can be great games on weak hardware; fun isn't driven by pixel counts or AA levels; many will be satisfied with lesser graphics, and more still will tolerate them--to all of that, let's accede.

But settling for more constrictive parameters and a more limited goal from the beginning is defeatist. This is my entertainment, which (when good) I enjoy deeply. Story, gameplay, fun, graphics--I want it all! And since many others do too, you shouldn't be surprised or dismissive when they're passionately disappointed by underachievement.
 

mrklaw

MrArseFace
He makes some fantastic points, but I feel game content below 1080p scales very, very well on my 1080p television. I think it really comes down to the quality of the scaler in a person's television, or the quality of the scaler in the system itself.

My HDTV is native 1080p, but games like Halo 4, Uncharted 2, GTA V, and a multitude of others all look quite amazing on it. Pretty much the only issue I ever run into is not being able to see the text on the user interface as well as I'd like. My go-to examples for this currently are reading text messages on the cell phone in GTA V, or trying to get a good look at where cops are on the mini-map when escaping from them. So, my main argument is: if you have a native 1080p television right now, and don't have any major complaints about 720p games on the 360 or PS3, why would scaling issues that are already non-existent for you be made worse by the more powerful Xbox One, which will produce much superior graphics quality at even higher resolutions than the 360 or PS3?

I agree with you on this. For me at least, 720p games scale nicely on my TV. Adding a 1080p HUD through use of the display planes will help even more. Of course I'd prefer native 1080p, but I could live with 720-900

Hypothetically, if the Xbox One could handle all games with an equivalent detail level to the PS4, just at 900p instead of 1080p, that would be an acceptable outcome for MS, I think. And it may be that multiplatform games go that way.
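The raw pixel arithmetic behind that 900p-vs-1080p trade-off is straightforward, assuming the usual resolution targets of 1600x900 and 1920x1080:

```python
# Pixel counts for the common resolution targets discussed here.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_900p = 1600 * 900    # 1,440,000
pixels_720p = 1280 * 720    # 921,600

# 1080p pushes 44% more pixels than 900p, and 2.25x as many as 720p.
ratio_900 = pixels_1080p / pixels_900p
ratio_720 = pixels_1080p / pixels_720p
print(ratio_900, ratio_720)
```

So a console that can only hit 900p at a given detail level is shading roughly 30% fewer pixels per frame than one hitting 1080p, which is the kind of gap a scaler can paper over but not eliminate.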

However, I do think there is too much of a performance gap, and first-party games on PS4 will certainly be doing things that simply cannot be done on Xbox One without significant compromise.
 

sol_bad

Member
My HDTV is native 1080p, but games like Halo 4, Uncharted 2, GTA V, and a multitude of others all look quite amazing on it. Pretty much the only issue I ever run into is not being able to see the text on the user interface as well as I'd like. My go-to examples for this currently are reading text messages on the cell phone in GTA V, or trying to get a good look at where cops are on the mini-map when escaping from them. So, my main argument is: if you have a native 1080p television right now, and don't have any major complaints about 720p games on the 360 or PS3, why would scaling issues that are already non-existent for you be made worse by the more powerful Xbox One, which will produce much superior graphics quality at even higher resolutions than the 360 or PS3?

So ............. have you tried hooking your extremely powerful PC up to your 1080p TV set and playing games at 1080p? Have you seen the difference between a native 720p and a native 1080p game on your TV set?
I have my X360, PS3 and PC all hooked up to my 1080p plasma, and as great as Uncharted 2, GTA V and The Last of Us are technically, their image quality takes a massive hit in comparison to my PC games at 1080p.
There is honestly no comparison, and I'm excited about the PS4 to hopefully escape the poor picture quality of current consoles.
 

Daeva

Banned
So ............. have you tried hooking your extremely powerful PC up to your 1080p TV set and playing games at 1080p? Have you seen the difference between a native 720p and a native 1080p game on your TV set?
I have my X360, PS3 and PC all hooked up to my 1080p plasma, and as great as Uncharted 2, GTA V and The Last of Us are technically, their image quality takes a massive hit in comparison to my PC games at 1080p.
There is honestly no comparison, and I'm excited about the PS4 to hopefully escape the poor picture quality of current consoles.

Agreed, people are seemingly making the usual excuses for upscaled games. 720p looks horrid in the vast majority of games to me. 900p, while better, is going to suffer the same fate in many ways. You will always be wondering what native res looks like if you are into graphics.

All this scaler talk is pretty bleh to me. Scaling does not make up for resolution in any way. I game on a high-end 42-inch fully backlit LED, and 720p just doesn't do it for me anymore, though I can handle it in some games.
 

Bundy

Banned
And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, then I see TV TV Sports TV.
Yep!


And do you remember this ----> "Victory will go not to those who make the most noise, but those who make the most impact".
I'm still laughing.....
 
I wonder how different it might have been if MS had gone for eDRAM on a daughter die? They'd have been able to put more on it (64-128MB), have had faster bandwidth, and freed up space on the APU for more CUs. Could have been on par with PS4. But they went with eSRAM I guess mainly for cost reduction possibilities later in the generation, which could bite them on the ass.

except EDRAM is also expensive and would be harder to integrate come the next die shrink
 

mitchman

Gold Member
And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, then I see TV TV Sports TV.

And then we have this famous quote from that guy:

https://twitter.com/majornelson/status/304396492314128385

Announce a console without actually showing a console? That's one approach
 

Asherdude

Member
I'll take your word for it but all I know is when I came here to check out GAF's reaction (which I admit fits your time frame) Sony was always ahead in opinions.
GAFers who didn't listen to rumors assumed that the Xbone would be more powerful. But, like we all know, rumors have a nasty habit of being true.
 

killatopak

Gold Member
This just in. According to misterxmedia's insider, Xbox One has an additional 4 old CU's to reserve the 10% GPU for the Kinect so its TFLOPs is still unchanged....


LOLOLOLOLOLOLOL!

Oh gosh. I know it's bad to bring him up, but every time bad news about the Xbox One gets reported, he undoubtedly has an explanation for it. Now, with Cboat's leaks, I don't know what to say anymore.


Would be nice if you could necro the thread to make those people who said MS PR knows what they're doing eat crow. I'm a junior though, so it wouldn't be nice.
 
I thought this quote from MS was interesting.

He continued, “Our job is not to get lost in the industry chatter, but to ensure that the consumers going into stores, going online, still experience Xbox. The announcement is important for the industry, but it is also important to remember who is buying right now.”
Exerting caution over looking too far into the future – and forgetting about current consumers as a result – Grimes added, “What has got us to where we are today is not going to be what gets us through the next ten years. Now we’ve also got to start from scratch when thinking about the future. Driving creativity, whether around marketing, retail or new business models, is key for us.”

It's exactly what they did. They abandoned core gamers and were suggesting Sony should do the same. I think that's what he meant??? Here's the original thread

http://m.neogaf.com/showthread.php?p=47704350
 

Skeff

Member
Can we just imagine Xbox One has roughly the same effective bandwidth - if programmed carefully - as PS4? Then we can move the discussion on to the more concrete areas like 50% more CUs, 100% more ROPs etc. I'd be interested to know how, assuming similar effective bandwidth, each system would be able to utilise and keep fed those other elements.

But it doesn't... Though even if we took the premise that it did, it wouldn't really make a difference; it would still be limited by the size of the eSRAM. Unless we also imagined that the DDR3 RAM was faster, to the point where it was feasible to put part of the G-buffer in there without a noticeable performance drop, then we'd be looking at 1080p more consistently. But neither of those premises is real.
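To put a rough number on the size problem, here's a hypothetical sketch of a deferred renderer's G-buffer footprint at 1080p versus the 32 MB of eSRAM. The render-target layout below is an illustrative assumption, not any shipping engine's actual format:

```python
# Estimate G-buffer footprint vs the Xbox One's 32 MB of eSRAM.
# The target list is a plausible deferred-shading setup, chosen
# for illustration only.

def buffer_mb(width, height, bytes_per_pixel):
    """Size of one full-screen buffer in mebibytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

W, H = 1920, 1080
targets = {
    "albedo (RGBA8)": 4,
    "normals (RGBA8)": 4,
    "material params (RGBA8)": 4,
    "HDR light accum (RGBA16F)": 8,
    "depth/stencil (D24S8)": 4,
}
gbuffer = sum(buffer_mb(W, H, bpp) for bpp in targets.values())
print(f"{gbuffer:.1f} MB needed vs 32 MB of eSRAM")
```

Under those assumptions the G-buffer alone overshoots 32 MB by around 50%, which is why the realistic options are spilling some targets to the much slower DDR3 or dropping the resolution; at 720p the same layout comes in around 21 MB and fits.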

I wonder how different it might have been if MS had gone for eDRAM on a daughter die? They'd have been able to put more on it (64-128MB), have had faster bandwidth, and freed up space on the APU for more CUs. Could have been on par with PS4. But they went with eSRAM I guess mainly for cost reduction possibilities later in the generation, which could bite them on the ass.

The only reason they didn't choose an edram daughter die is $$$$$$.
 