
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

pushBAK

Member
http://reviews.cnet.com/lcd-monitors/samsung-syncmaster-2233rz/4505-3174_7-33499496.html

The better of my two PC monitors tops out at 1680x1050.

I don't really have a choice in the matter, but I also don't feel I'm missing very much. The games still look unbelievable at really high graphics settings.

That's an apples vs. oranges comparison... it's not really a '1080p vs 900p' argument, it's a 'native res vs upsampled res' argument. If 1600x900 TVs were the norm, nobody would be complaining. But the fact of the matter is 1920x1080 is the norm.
 
Eh? I'm not suggesting you need to dwell on others' disappointments. You seem to present your case repeatedly as if you expect others to be convinced. If all you care about is your own opinion and not others', then why bother repeatedly justifying your stand? All I'm trying to tell you is that people's thresholds are different. You can keep talking about your own, but it doesn't really add anything to the conversation if you aren't willing to understand how others look at it as well.

I'm not criticizing or attacking anyone else's view. I'm giving my view. Big difference.

Why build a great gaming machine and never upgrade your monitor?

Because money doesn't grow on trees, unfortunately, which is probably ironic for someone buying both systems at launch. But I have a list of priorities, and upgrading a monitor that I'm already more than satisfied with is at the very bottom of my list. I'd feel bad spending $200+ upgrading my monitor when I could've put that money towards a new graphics card, a better processor, a bigger SSD, Xbox One/PS4 purchases, etc. My secondary monitor is a complete piece of shit, something I only started using when my Dell CRT crapped out on me and I needed something to hold me over till I got this 120Hz Samsung. When I got my Samsung monitor, I decided, what the hell, I'll keep using my other crappy LCD, but as a second monitor. It serves its purpose, though.
 

lumzi23

Member
If you go back a year or so before the unveiling, MS were favoured in power just because of their last platforms and finances. But as we received more information the picture became more clear, and there were a lot of account suicides over MS losing the position of power leader.

Okay, but he should have at least included it in his post. Reading his version of events, it sounds like Sony was the viciously downtrodden underdog until their grand unveiling.

No, I think someone reliable (I forget who it was) dropped something along the lines of the Xbone being a "supah-computah".

Based on that, there was a window (however small it was) where it was expected that the Xbone would continue the 360's reign.

See above.
 
If nothing else, they appear to have gone out of their way to avoid such a scenario this time around, so I doubt another RROD situation; a healthy bit of their engineering focus seems to have been on precisely that issue. On the 1080p front, yeah, I can see developers quite regularly opting for resolutions lower than 1080p to give themselves some extra freedom to do more with their game, and I quite honestly prefer it that way. If they can hit 1080p and still make the game look and run great, then I'm all for it, but I don't see anything wrong with resolutions lower than 1080p. I'd take a higher level of graphics quality over a higher resolution every time, as long as we aren't dealing with sub-720p resolutions, which I honestly don't see happening. Unless you have the performance muscle to spare, the benefit of 1080p is simply greatly outweighed by making an even more impressive looking game at lower resolutions.

Take Quantum Break as an example: no way in hell do I want Remedy to limit their vision because they somehow feel they must run at 1080p. Go as low as they need to, as long as that doesn't mean sub-720p. Anything at or above that, I'm totally fine with.

720p is your new standard? Going 1080p means limiting developers' vision? Less eye candy doesn't matter? Wut now?
 
32MB of ESRAM is not big enough for many deferred rendering buffers.

For example, Killzone: Shadow Fall's buffers are around 47MB, and BF3 on PC at 1080p was around 55MB; both are bigger than 32MB.

I don't remember the exact numbers for the buffers, but they're about right.

The PS4 does not need to struggle to fit a buffer into memory, as all 8GB of the PS4's memory is very high speed.

Reducing the same 55MB buffer from 1080p to 720p would likely bring it under 32MB so it would fit in ESRAM.
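A quick back-of-the-envelope check of that, assuming the ~55MB figure above and that buffer size scales linearly with pixel count (Python, purely illustrative):

# Scale a resolution-dependent buffer by the pixel-count ratio.
def scaled_size_mb(size_mb, src=(1920, 1080), dst=(1280, 720)):
    ratio = (dst[0] * dst[1]) / (src[0] * src[1])
    return size_mb * ratio

print(scaled_size_mb(55))                   # ~24.4MB -> fits in 32MB ESRAM
print(scaled_size_mb(55, dst=(1600, 900)))  # ~38.2MB -> still over 32MB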

This! With my limited tech knowledge, this made the most sense to me. You either reduce the amount of data per pixel or you reduce the number of pixels.


In regards to the ability to run 1080p, I would have figured it would be due more to the disparity between the GPUs than the memory configuration. Aren't there current-gen games that hit 1080p with the 360's memory setup, which uses a lot of the same principles as the One's?

Also, I wouldn't think that 1080p is a barrier in and of itself; developers should be able to target 1080p. I would think the distinction is what they have to give up in texture quality and effects, and that trade-off is why they wouldn't hit it.

Forza is an example of this.
 

CLEEK

Member
That may be the reality of the situation, and I'm fine with that. I have a pretty powerful PC, and all my gaming is done at 1680x1050, and I think the new consoles are capable of a level of graphics performance where a lower resolution doesn't exactly strike me as the end of the world. Look at the better-looking 720p titles this generation. I didn't see very many complaints about those, and certainly high-quality efforts on the Xbox One will easily top them, and should be almost certain to sport far better image quality.

There is one massive reason why hitting native 1080p is more important next gen than 720p vs sub-HD this gen. It's the native res of people's TVs. This gen, even if you had a 720p HDTV, it was highly unlikely your HDTV was 1280x720, so there was scaling going on.

You mention you play your PC games 1680x1050. For such an arbitrary resolution, I assume it's the native res of your monitor? Therefore the optimal native res a game can ever run at.

Just look at the Vita. The single biggest factor in IQ in its games is running at native 960x544. If non-native, it doesn't matter how many effects or quality of AA a game has, the image will always look cleaner and better at native res.

So it has nothing to do with trade offs between effects and performance. Just the simple fact that if you have a 1080p TV, native 1080p will always produce the best IQ, so should always be what devs target. It's fucking insane if the very design of the Xbone eSRAM prohibits this.
 

viveks86

Member
I'm not criticizing or attacking anyone else's view. I'm giving my view. Big difference.

That's the problem I'm trying to highlight, dear sir. I've seen so many posts from you where all you are saying is "This is what I care about. The other stuff doesn't matter." That's your preference. And that's fine. You can't present that as a valid argument. It is subjective.
 
720p is your new standard? Going 1080p means limiting developers' vision? Wut now?

Let me explain. The Xbox One isn't as powerful as the PS4. That is obvious. To target 1080p on the Xbox One means naturally accepting that you're going to do less ambitious things with your game graphically, because it doesn't exactly have the most raw power. If by going to a lower resolution developers can make a more graphically impressive game, then I support that. No, 720p isn't my standard; it's just the bare minimum of what I'd be willing to accept on the Xbox One. My preference for the Xbox One is 900p.

That's the problem I'm trying to highlight, dear sir. I've seen so many posts from you where all you are saying is "This is what I care about. The other stuff doesn't matter." That's your preference. And that's fine. You can't present that as a valid argument. It is subjective.

A bit of a gross oversimplification of what I usually say, but point taken. That said, anybody's view on this is subjective, I think. As others point out, any developer can target 1080p for their game; it's just a question of what they're willing to give up in order to do so. It's obvious, and I don't think this part is subjective, that to target 1080p on the Xbox One, developers would have to give up much more than they would on the PS4. For what developers would have to give up on the Xbox One, I don't feel 1080p is worth it. If I had a choice of Ryse at 1080p, but with the game doing less technically, compared to what we now have at 900p, I take the 900p version. Quantum Break is an example I like to use because, based on what they say they want to achieve, it's bound to be one of the more technically interesting or impressive games on the Xbox One, and so if the Xbox One simply doesn't have the horsepower to nail what they want to nail at 1080p, with the level of graphical fidelity they desire, then I support targeting a lower rendering resolution. I feel everything else they would change to make it 1080p would be an even bigger compromise than the resolution in the first place. Or maybe I'm underestimating what the system will be able to do at 1080p, and I don't doubt for a second that could be the case, but I'm okay with lower-than-1080p resolutions because, from my experience, the games still look fantastic.
 

vcc

Member
The PS3 is capable of delivering 1080p... just because it can doesn't mean it was truly designed for it.

There are a number of limitations of the Xbox One hardware (ROPs, CUs, bandwidth) that will make 1080p more difficult to achieve.

Ultimately I think we'll see more non-1080p games on Xbox One than 1080p titles. We already see that for launch titles...

Hence, I don't see how you can say it was designed for 1080p.

One of the reasons the PS3/360 never hit 1080p consistently isn't that they weren't designed for it. They were, but image quality standards rose high enough that the trade-off of 1080p versus 720p with better IQ wasn't worth it.

Similarly, the XB1 was designed for 1080p, as was the PS4, but the image quality targets between the two were different, and it seems the standard is higher than the XB1 can deliver. So they have to compromise on resolution.

Unless you have the performance muscle to spare, the benefit of 1080p is simply greatly outweighed by making an even more impressive looking game at lower resolutions.

The problem for Microsoft is that it very much appears the PS4 has that extra muscle to spare. 1080p is around 44% more pixels than 900p, and the PS4 beats the XB1 by 40% or more in just about every part of the rendering pipeline. The only exception is raw memory bandwidth; if the ESRAM and DDR3 are used together, you get rough parity at around 175GB/s (the 200GB/s ESRAM number is incredibly dumb; it's a one-off trick they projected into an impossible number).
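For anyone who wants to check the arithmetic, here it is in Python (the bandwidth figures are the commonly cited specs, so treat them as assumptions):

# 1080p vs 900p pixel counts.
print((1920 * 1080) / (1600 * 900))  # 1.44 -> 1080p is ~44% more pixels

# Commonly cited bandwidth figures in GB/s.
ps4_gddr5 = 176
xb1_ddr3, xb1_esram = 68, 109
print(xb1_ddr3 + xb1_esram)  # 177 -> rough parity with the PS4, if combined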
 
http://reviews.cnet.com/lcd-monitors/samsung-syncmaster-2233rz/4505-3174_7-33499496.html

The better of my two PC monitors tops out at 1680x1050.

I don't really have a choice in the matter, but I also don't feel I'm missing very much. The games still look unbelievable at really high graphics settings.



Well, that wouldn't be too surprising if it did. The PS4 is clearly the stronger system, after all. The real question is what does less eye candy on high-quality Xbox One releases actually mean? I suspect people won't be too disappointed by what 343i does with the next Halo, or what Remedy does with Quantum Break. Look at what Crytek is already doing with Ryse at launch. If that's what lower resolution and less eye candy means for the system, I don't think Xbox One gamers will exactly be scraping the very bottom of the graphics barrel, you know what I'm saying?


So to sum everything up, you're basically saying "it's good enough."
 

CLEEK

Member
I think this is what we call 'revising history.'

Even before the PS4 unveiling, the rumors always supported the PS4 as more powerful. And the fans have always capitalized on this. Let's keep this straight.

If you look at my post history, you will see I've been active in all the hardware rumour threads around the next gen consoles.

You're most certainly incorrect. All the smart money was on the Xbox 3 being more powerful.

This was true when the first hardware leaks started as well. It was only at the tail end of last year (weeks before the reveal of the PS4) that the Orbis edged out in front, and even then there was lots of talk about secret sauce / dual GPUs / dual SoCs in the Durango that would mean it eclipsed the Orbis in power.
 

Ploid 3.0

Member
Ahh, so 720-900p will be the norm while Forza remains at 1080p. Also, the February event fucked up Microsoft's plans.

Thanks Butts.

And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, and then I see TV TV Sports TV.
 

James Sawyer Ford

Gold Member
And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, and then I see TV TV Sports TV.

It's a case of group-think mentality.

They were really confident in the Xbox One project, and it got to their heads, even though clearly a lot of people disagreed with the direction they went and were far more impressed with the PS4.
 

nib95

Banned
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).

Hmm...

There is one massive reason why hitting native 1080p is more important next gen than 720p vs sub-HD this gen. It's the native res of people's TVs. This gen, even if you had a 720p HDTV, it was highly unlikely your HDTV was 1280x720, so there was scaling going on.

You mention you play your PC games 1680x1050. For such an arbitrary resolution, I assume it's the native res of your monitor? Therefore the optimal native res a game can ever run at.

Just look at the Vita. The single biggest factor in IQ in its games is running at native 960x544. If non-native, it doesn't matter how many effects or quality of AA a game has, the image will always look cleaner and better at native res.

So it has nothing to do with trade offs between effects and performance. Just the simple fact that if you have a 1080p TV, native 1080p will always produce the best IQ, so should always be what devs target. It's fucking insane if the very design of the Xbone eSRAM prohibits this.

Well put.
 
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).

Hmm...



Well put.

He's been full of stinky shit ever since he joined. Nothing new.
 
This is just an 'in general' statement, but man, PC gamers are really trying to ruin all the fun this console generation. It's so boring. You can already get a PC; the consoles aren't out yet. Stop trying to embed the PC into every discussion; it's so tedious and obnoxious. It's so odd that PC-only gamers act like they're trying to better others or educate people on how it's possible to game on a PC. I'm pretty sure everyone knows this. I never thought there would be a gaming community that out-annoys even the most obnoxious of "fanboys".
 

Ashes

Banned
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).

Hmm...

Why does this blow your mind? Console gamers go gaga over graphics but play on, er, consoles. How many GT/KZ2/UC gifs exist showing off the graphics? Their view is common enough.

This is just an 'in general' statement, but man, PC gamers are really trying to ruin all the fun this console generation. It's so boring. You can already get a PC; the consoles aren't out yet. Stop trying to embed the PC into every discussion; it's so tedious and obnoxious. It's so odd that PC-only gamers act like they're trying to better others or educate people on how it's possible to game on a PC. I'm pretty sure everyone knows this. I never thought there would be a gaming community that out-annoys even the most obnoxious of "fanboys".

Especially considering that PCs are in no way monolithic. Some parts cost more than an entire console.
 

viveks86

Member
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).

Hmm...

I smell another verbal brawl. Nib, you were just pulling his leg, right? Please say yes? :)
 
One of the reasons the PS3/360 never hit 1080p consistently isn't that they weren't designed for it. They were, but image quality standards rose high enough that the trade-off of 1080p versus 720p with better IQ wasn't worth it.

Similarly, the XB1 was designed for 1080p, as was the PS4, but the image quality targets between the two were different, and it seems the standard is higher than the XB1 can deliver. So they have to compromise on resolution.



The problem for Microsoft is that it very much appears the PS4 has that extra muscle to spare. 1080p is around 44% more pixels than 900p, and the PS4 beats the XB1 by 40% or more in just about every part of the rendering pipeline. The only exception is raw memory bandwidth; if the ESRAM and DDR3 are used together, you get rough parity at around 175GB/s (the 200GB/s ESRAM number is incredibly dumb; it's a one-off trick they projected into an impossible number).

I don't disagree, but I also don't see it as a problem that will meaningfully show up to such an extent that people will feel like crap for even owning an Xbox One. There will be amazing looking games on the Xbox One regardless of what the numbers are, and while I completely understand why these numbers would be such a very big deal around here, I don't feel it will matter to the broader audience of gamers that just want great looking games regardless of where they find them. As an example, despite the Xbox 360 regularly having the superior versions of multi-platform games, there isn't a thing you could tell the folks that bought the PS3 version of games that would convince them that they were somehow getting a less than stellar gaming experience. If the consoles are being viewed in a purely competitive light, and only ever under such pretenses, then the differences between the two systems are really all that matters. However, most gamers really don't see things that way. They get a system, they buy games, and then they enjoy themselves.

But then I suppose I totally understand how that view might seem pretty out of place in a thread primarily about the differences in power between the two systems, so I've probably said about as much as is necessary on the matter. Other people see things differently, and I respect that.
 

Skeff

Member
I just wrote a post in another thread about why the Xbox One is sub-1080p due to the 32MB of ESRAM, and thought it would be useful in here, as there was some discussion about why the XB1 would be stuck below 1080p and what CBOAT could have meant:

I can't get into detail without giving up what I do, which I can't do. So I'll just concede. I honestly should have never brought that up if I could not reliably defend it. For that, I apologize.

Well, if you're playing the insider card, see a mod or it's a ban. But let's carry on anyway:

You can easily show us the math demonstrating that a render target "easily fits into 32MB". I'll show you my side of our disagreement:

It would almost definitely be 32-bit colour depth. We could make the case it wouldn't fit in 32MB depending on the techniques used; for example, with 4xFSAA at 1080p we'd be looking at:

Back Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel] * 4[FSAA Depth]
= 265420800 bits = 31.6MB

Depth Buffer:
1920x1080 [Resolution] * 32 [24Bit Z, 8Bit Stencil] * 4 [FSAA Depth]
= 265420800 bits = 31.6MB

Front Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel]
= 66355200 bits = 7.91MB

Total Memory Requirements:
31.6 + 31.6 + 7.91 = 71.11MB

It's not quite as simple as you put it.

Without the AA you'd be looking at around 24MB.
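Here's the same math as a quick Python script, in case anyone wants to play with the numbers (same assumptions: 32-bit colour, a 24/8 depth/stencil buffer, and a non-multisampled front buffer):

# Size in MB of a width x height buffer at the given bits per pixel,
# multiplied by the AA sample count.
def buffer_mb(width, height, bpp, samples=1):
    return width * height * bpp * samples / 8 / 2**20

back  = buffer_mb(1920, 1080, 32, samples=4)  # 31.64MB
depth = buffer_mb(1920, 1080, 32, samples=4)  # 31.64MB
front = buffer_mb(1920, 1080, 32)             #  7.91MB
print(back + depth + front)                   # ~71MB with 4xFSAA
print(buffer_mb(1920, 1080, 32) * 3)          # ~23.7MB without AA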

Though this method is becoming less and less used in games development:

That's not applicable at all to modern rendering techniques.

Skeff further up is closer but probably not familiar with deferred shading. When you are using deferred shading, you need to store all the input data for your later lighting calculations in your render buffer (usually called a g-buffer). The exact amount of memory you need to do this depends on the engine and what you are going for, but you're unlikely to get by with less than 16 bytes per pixel.
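To make that 16 bytes per pixel concrete, here's a purely illustrative g-buffer layout; it isn't from any particular engine, and real layouts vary:

# Four 32-bit render targets = 16 bytes per pixel (illustrative only).
gbuffer_bytes_per_pixel = {
    "albedo (RGBA8)":    4,
    "normals (RGB10A2)": 4,
    "material (RGBA8)":  4,
    "depth (D24S8)":     4,
}
bpp = sum(gbuffer_bytes_per_pixel.values())  # 16
for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    print(f"{w}x{h}: {w * h * bpp / 2**20:.2f}MB")
# 1920x1080: 31.64MB / 1600x900: 21.97MB / 1280x720: 14.06MB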


Quite.


Further to my comment on a previous page, I looked up the more exact figure for BF3's gbuffers @ 1080p/4xMSAA: 158MB.

http://www.slideshare.net/fullscreen/DICEStudio/shiny-pc-graphics-in-battlefield-3/20

You could work from that figure for 1080p/NoAA. Though BF4's may differ.

Without AA?

That works out to 80 bytes per pixel for 4xAA. Assuming it scales linearly with AA samples (which is an absolute worst case scenario) that would mean 20 bytes per pixel for no AA. Which would result in ~40 MB buffer size at 1080p.
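Working that through from the linked 158MB figure:

# 158MB of g-buffers at 1080p/4xMSAA (from the DICE slides above).
bytes_px_4x = 158 * 2**20 / (1920 * 1080)  # ~79.9 -> ~80 bytes per pixel
bytes_px_1x = bytes_px_4x / 4              # ~20, if it scaled linearly
print(bytes_px_1x * 1920 * 1080 / 2**20)   # ~39.5MB at 1080p with no AA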

If we take Durante's suggested minimum of 16 bytes per pixel, we'd be looking at around 32MB, which is about as small as you could go: 31.64MB to be exact.

This is essentially the MINIMUM you could get away with; most games would require much more.

It's techy but I'm sure some will enjoy the read.
 

viveks86

Member
There is one massive reason why hitting native 1080p is more important next gen than 720p vs sub-HD this gen. It's the native res of people's TVs. This gen, even if you had a 720p HDTV, it was highly unlikely your HDTV was 1280x720, so there was scaling going on.

You mention you play your PC games 1680x1050. For such an arbitrary resolution, I assume it's the native res of your monitor? Therefore the optimal native res a game can ever run at.

Just look at the Vita. The single biggest factor in IQ in its games is running at native 960x544. If non-native, it doesn't matter how many effects or quality of AA a game has, the image will always look cleaner and better at native res.

So it has nothing to do with trade offs between effects and performance. Just the simple fact that if you have a 1080p TV, native 1080p will always produce the best IQ, so should always be what devs target. It's fucking insane if the very design of the Xbone eSRAM prohibits this.

Well said. Senjutsu, this is how an argument should be made on the topic. Not "I don't mind Quantum Break being 720p or 900p." That doesn't work in a debate, because that's just your preference.
 
And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, and then I see TV TV Sports TV.

Dat arrogance!
 

nib95

Banned
Why does this blow your mind? Console gamers go gaga over graphics but play on, er, consoles. How many GT/KZ2/UC gifs exist showing off the graphics? Their view is common enough.

It blows my mind because I wrongly expected people who are heavily devoted to tech talk and graphical fidelity to have invested somewhat into extracting more in that regard. 22" 900p monitors and GTX 560s are hardly the epitome of high tech. It's no surprise, for example, that SenjutsuSage would care so little for 1080p if his monitor doesn't even offer the resolution.
 

Ashes

Banned
It blows my mind because I wrongly expected people who are heavily devoted to tech talk and graphical fidelity to have invested somewhat into extracting more in that regard. 22" 900p monitors and GTX 560s are hardly the epitome of high tech. It's no surprise, for example, that SenjutsuSage would care so little for 1080p if his monitor doesn't even offer the resolution.

It's nice to want things I guess.
 
I don't disagree, but I also don't see it as a problem that will meaningfully show up to such an extent that people will feel like crap for even owning an Xbox One. There will be amazing looking games on the Xbox One regardless of what the numbers are, and while I completely understand why these numbers would be such a very big deal around here, I don't feel it will matter to the broader audience of gamers that just want great looking games regardless of where they find them. As an example, despite the Xbox 360 regularly having the superior versions of multi-platform games, there isn't a thing you could tell the folks that bought the PS3 version of games that would convince them that they were somehow getting a less than stellar gaming experience. If the consoles are being viewed in a purely competitive light, and only ever under such pretenses, then the differences between the two systems are really all that matters. However, most gamers really don't see things that way. They get a system, they buy games, and then they enjoy themselves.

But then I suppose I totally understand how that view might seem pretty out of place in a thread primarily about the differences in power between the two systems, so I've probably said about as much as is necessary on the matter. Other people see things differently, and I respect that.

Did you just write that? I am playing Red Dead Redemption at the moment and guess what? Every second I am playing that game, I am telling myself "I AM PLAYING THE SHITTIEST VERSION OF THIS GAME". So please stop! Just stop!

Edit: I own a PS3, and I can only imagine how much better that beautiful world Rockstar created looks on the 360, 'cause I've seen the side-by-side comparisons.
 
I'm "clinging" to a stated fact that came right out of the Xbone software engineer's mouth. It's not like I'm making any of this up or using a straw-man argument to fit my opinions. Evolution is about natural growth, because you are now better equipped to deal with an environment or situation than you were previously.

So why don't you start replying to the posts refuting your spin?

Here is the problem with the ESRAM on the Xbox One compared to the EDRAM on the 360: the One is roughly 6 times more powerful, but the main memory bandwidth is only 3 times greater. This places more pressure on the embedded memory to bridge the gap, but it is only about 3 times larger and is actually slower. So both the bandwidth and capacity ratios relative to GPU power are quite a bit worse, requiring a lot more optimization. And the proliferation of deferred rendering techniques, with the large buffers and numerous render targets they employ, only exacerbates the situation.
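The ratios check out if you plug in the commonly cited specs (public figures, so treat them as assumptions rather than anything confirmed in this thread):

# 360: ~240 GFLOPS GPU, 22.4GB/s GDDR3, 10MB EDRAM (~256GB/s internal).
# XB1: ~1310 GFLOPS GPU, 68.3GB/s DDR3, 32MB ESRAM (~109GB/s).
print(1310 / 240)   # ~5.5x the GPU power
print(68.3 / 22.4)  # ~3.0x the main memory bandwidth
print(32 / 10)      # 3.2x the embedded memory capacity
print(109 < 256)    # True -> the embedded memory is actually slower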

thx to esram, 720 to 900 will be the norm on xone

No one is saying that the 360 was difficult; they were saying it was "more difficult than not having EDRAM", and that doesn't add much complexity.

And of course the PS4 is relevant in this: the difficulty of development is entirely relative to the difficulty of developing for the competing console or PC. That's what gives us a baseline of what is easy and what is hard. It's like playing a sport; it's only as hard as whoever you're playing against.

Let's look again at that quote you posted:


OK, so a new ability to texture out of ESRAM. You know what this means? A new API for it. But 32MB is small; can you put full textures in there, or are you forced to use PRT? I'd guess PRT, so a new API for that too. So you would have to use two new APIs to do this. How good are these new APIs? We don't know; from the developer quote in the OP of this thread, we could say "horrible".



Do you know what this means? It means the memory is now addressable, so developers can choose where to put things. This allows for more possibilities. HOWEVER, it also means developers need to put things in the right place. Now that we know they do this, let's look again at the ESRAM and the "1024-bit bus".

It's not a 1024-bit bus; it's 4x 256-bit buses linked to 4 pipelines of 8MB of ESRAM each. The developers need to manage these now that they're addressable: to get access to the 109GB/s read speed, it seems you need to have your data spread across all four 8MB chunks.
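A quick sketch of why the spreading matters, assuming the publicly reported 853MHz GPU clock:

# Each of the 4 lanes moves 256 bits per cycle at the GPU clock.
clock_hz  = 853e6
per_lane  = 256 * clock_hz / 8 / 1e9  # ~27.3GB/s per 8MB chunk
all_lanes = per_lane * 4              # ~109.2GB/s with all four lanes busy
print(per_lane, all_lanes)
# Data confined to a single 8MB chunk only ever sees ~27GB/s.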

ESRAM is not EDRAM, and the article you quoted is MS PR; it shows falsehoods even in the small snippet you quoted.

You can add to that the Edge developer comment from the OP.

So we have GAF Junior Member and MS PR interview vs the rest of the world.
 
Well said

He makes some fantastic points, but I feel game content below 1080p scales very, very well on my 1080p television. I think it really comes down to the quality of the scaler in a person's television, or the quality of the scaler in the system itself.

My HDTV is native 1080p, but games like Halo 4, Uncharted 2, GTA V, and a multitude of others all look quite amazing on it. Pretty much the only issue I ever run into is not being able to see the text on the user interface as well as I'd like. My go-to examples at the moment are reading text messages on the cell phone in GTA V, or trying to get a good look at where cops are on the mini-map when escaping from them. So my main argument is: if you have a native 1080p television right now and don't have any major complaints about 720p games on the 360 or PS3, why would those already non-existent scaling issues be exacerbated by the more powerful Xbox One, which will produce much superior graphics quality at even higher resolutions than the 360 or PS3?

Did you just write that? I am playing Red Dead Redemption at the moment and guess what? Every second I am playing that game, I am telling myself "I AM PLAYING THE SHITTIEST VERSION OF THIS GAME". So please stop! Just stop!

Edit: I own a PS3, and I can only imagine how much better that beautiful world Rockstar created looks on the 360, 'cause I've seen the side-by-side comparisons.

Rather than telling me to stop, just accept that you and I have very different views. Although, in fairness, I do have the 360 version of Red Dead Redemption, so I don't really know what the PS3 version of that game is like. But a major flaw in how we sometimes view things, I think, is that we readily accept that, due to the architectural similarities between the two systems this time around, the performance gap between them is even more telling or significant than whatever gap existed between the 360 and PS3. That we all accept.

However, at the very same time, we don't seem to readily acknowledge that the Xbox One architecture is likely nowhere near as difficult for programmers to extract performance from as the PS3 was, which means there's a very high probability we won't be seeing any Bayonetta-style porting disasters on the Xbox One. That may significantly blunt the "wow, this game is so unplayable on the Xbox One" complaints, like you see some people make about PS3 versions of certain multi-platform games. There is no Cell processor in the Xbox One. ESRAM may be a challenge, but it isn't PS3 levels of challenging, at least certainly not from my understanding. So, really, the PS4 versions of multi-platform games will no doubt be superior, but just don't be surprised if the Xbox One version isn't absolute "shit" in terms of graphics quality and framerate. Many of the things gamers had to deal with on the current-gen systems won't apply this time around. Solid to great AF with some manner of AA seems a given, along with high-quality textures and shaders. I mean, we'll have to see, but I get the feeling the "ugly duckling" version this upcoming gen won't be quite so ugly compared to what we might have seen on the 360 and PS3.
 

lumzi23

Member
If you look at my post history, you will see I've been active in all the hardware rumour threads around the next gen consoles.

You're most certainly incorrect. All the smart money was on the Xbox 3 being more powerful.

This was true when the first hardware leaks started as well. It was only at the tail end of last year (weeks before the reveal of the PS4) that the Orbis edged out in front, and even then there was lots of talk about secret sauce / dual GPUs / dual SoCs in the Durango that would mean it eclipsed the Orbis in power.

I'll take your word for it, but all I know is that when I came here to check out GAF's reaction (which I admit fits your time frame), Sony was always ahead in opinions.
 

vcc

Member
I don't feel it will matter to the broader audience of gamers that just want great looking games regardless of where they find them. As an example, despite the Xbox 360 regularly having the superior versions of multi-platform games, there isn't a thing you could tell the folks that bought the PS3 version of games that would convince them that they were somehow getting a less than stellar gaming experience.

I'm not so sure. The 360 cleanly beat the PS3 in attach rate partly because, for the majority of its lifespan, the 360 had the better version of important multi-platform games (GTA4, COD, BF3, etc.).

People like the notion of value, and $100 cheaper for better performance is a major selling point, and the idea hit the mainstream. It was a simple, easy-to-explain idea, and it was news. The news loves that sort of narrative: an industry giant brought low by its hubris. It's the narrative they wrote about Atari, Nintendo, and Sony. Now it's Microsoft, and the narrative will inform purchase decisions.

We'll see how it all turns out, but MS needs to cut out the "we're not worried" posture and start taking their adversaries more seriously. If they bow out of the industry, it isn't a good thing for the consumer.
 

nib95

Banned
It's nice to want things I guess.

Didn't mean to come off as arrogant. I appreciate that everybody has different finances, priorities, outgoings, interests, hobbies and all the rest. Ultimately gaming is just a luxury, and putting less of one's finances towards it might not necessarily be a bad thing, especially if you have children, a mortgage and all the rest.
 

kaching

"GAF's biggest wanker"
I don't feel it will matter to the broader audience of gamers that just want great looking games regardless of where they find them.
If people just went about stumbling upon consoles by happenstance, I suppose that would be true. Otherwise it's an awfully convenient threshold of tolerance.
 

CLEEK

Member
I'll take your word for it, but all I know is that when I came here to check out GAF's reaction (which I admit fits your time frame), Sony was always ahead in opinions.

Before any tech specs were rumoured, the common thought was that Sony's financial position meant it couldn't afford any losses and couldn't afford to sink hundreds of millions into R&D. Sony publicly stated that R&D on the Cell alone cost them $400m. With these two facts in mind, the thought was the PS4 would be based on low-cost hardware, and the Xbox 3 would kill it off by having power to spare.

You also had chumps like SuperDAE swearing blind that the Durango was all-powerful and the Orbis was a joke. This was echoed in places like B3D, where even the smarter folks shared his sentiment. The fact that the PS4 is significantly more powerful than the Xbone only emerged from the start of this year, slowly clarifying over time as more became known about the Xbox One.
 

BigJoeGrizzly

Neo Member
As a former genetics major, I'd like to point out that Skeff is right. A fair amount of change is just drift without selection. Evolution doesn't have a direction; critical traits are selected for and against strongly, but non-critical ones can languish, vanish, or become omnipresent.

As for the ESRAM: it's an 'evolution' with selection criteria of 'make it cheap', 'make it easy to shrink' and 'be okay for apps and gaming', seemingly in that order. The evolution he may be referring to is the higher bandwidth and larger amount versus the EDRAM, but that says nothing about how it compares to the PS4.

Even so, you'll recall he's on the clock. That engineer, all the people from MS, Mark Cerny and all of the people from Sony have a transparent agenda: to present their machine in the best light.

Skeff is also right that it's about relative difficulty. For the PS3 vs 360, the EDRAM was a bonus; almost everything it offered was 'free' in comparison to the PS3. The bar was low and the expectations for its use were low. The 360 GPU was already more capable, and the architecture had fewer bottlenecks.

For the XB1 vs PS4, the ESRAM is not a free extra anymore. It's now required to overcome a bottleneck, so more time will be spent tinkering with it to match the memory throughput of the PS4. By all accounts it's not NEARLY as hard as juggling the six exotic mini-processors of the Cell, but it is harder than the PS4, which seems more straightforward than current commodity PCs.

As a former biology/psychology double major, does that now give me the right to my own interpretation of what "evolution" means? I'm not talking about the evolution of species here; I'm talking about the evolution of technology, which is in many ways a different usage. Evolution in species (as you know as a former genetics major) has more to do with adaptation, survival, and the ability to pass those genes to future generations. Evolution in technology has similar principles, but is mainly used loosely as another way to describe the improvement of tech from past to present.

And "relativity" was not originally the argument some here were making when they attempted to pick apart my initial statements (the shift to a "relativity" argument against my statements has only just started to surface). If someone wanted to say that the PS4's memory architecture is easier to grasp, that's all they have to say: "the PS4's memory architecture is easier to develop for." The problem is people start suggesting "the Xbone memory architecture is a bitch to develop for" (which was originally mentioned to me by Sword of Doom, who said "a developer" said it in the EDGE article). Look, figuring out the answer to 2+2 is definitely easier than 2x12 (and 2x12 is more "difficult" to figure out than 2+2), but in the grand scheme of things they're both pretty easy to figure out.
 

Ashes

Banned
I'll take your word for it, but all I know is that when I came here to check out GAF's reaction (which I admit fits your time frame), Sony was always ahead in opinions.

I don't entirely support his point, but I'd agree with his assertions. The X3 was king: monster 8-core CPU, 8GB RAM, all that jazz. And rumours suggested the Orbis was behind schedule and getting Jaguar instead of Steamroller, with 2GB of RAM, maybe 4. I'd say the X3 was ahead with its Xeon-powered dev kits.
 
If I had a choice of Ryse at 1080p, but with the game doing less technically, compared to what we now have at 900p, I take the 900p version.
The thing is, you do not know what we now have at 900p. The only direct-feed screenshots and videos released have been rendered at 1080p (higher than the game will actually run) or downscaled to 720p (from an unknown source resolution, giving free AA). Now, I don't expect the game to look considerably worse in terms of geometry, textures, etc. But I do expect the game to have more jaggies and shimmering than any media we've seen to date. I'm not sure that's an acceptable tradeoff for tons of blades of grass.
 

viveks86

Member
He makes some fantastic points, but I feel game content below 1080p scales very, very well on my 1080p television. I think it really comes down to the quality of the scaler in a person's television, or the quality of the scaler in the system itself.

My HDTV is native 1080p, but games like Halo 4, Uncharted 2, GTA V, and a multitude of others all look quite amazing on it. Pretty much the only issue I ever run into is not being able to see the text on the user interface as well as I'd like. My go-to examples at the moment are reading text messages on the cell phone in GTA V, or trying to get a good look at where cops are on the mini-map when escaping from them. So my main argument is: if you have a native 1080p television right now and don't have any major complaints about 720p games on the 360 or PS3, why would those already non-existent scaling issues be exacerbated by the more powerful Xbox One, which will produce much superior graphics quality at even higher resolutions than the 360 or PS3?


Firstly, you just highlighted some issues that probably will not go away with upscaled Xbox One games. Is it not reasonable for people to want those issues (however minor you think they are) to go away with a new generation?

Secondly, you have assumed there are no major complaints about upscaled 720p on the PS3 and 360. They might not be major complaints for you and me, but high-end PC gamers who are used to impeccable image quality will consider it a major complaint when they play console exclusives. I'm sure you would agree that image quality is quite crappy in many of the top-end games that push the boundaries of the hardware. So for people who care about image quality, don't you think those are valid complaints? Sure, you can anti-alias the heck out of it with the additional horsepower of next gen, but as SPE pointed out, it would still not be as good as native resolution.
 

Skeff

Member
As a former biology/psychology double major, does that now give me the right to my own interpretation of what "evolution" means? I'm not talking about the evolution of species here; I'm talking about the evolution of technology, which is in many ways a different usage. Evolution in species (as you know as a former genetics major) has more to do with adaptation, survival, and the ability to pass those genes to future generations. Evolution in technology has similar principles, but is mainly used loosely as another way to describe the improvement of tech from past to present.

And "relativity" was not originally the argument some here were making when they attempted to pick apart my initial statements (the shift to a "relativity" argument against my statements has only just started to surface). If someone wanted to say that the PS4's memory architecture is easier to grasp, that's all they have to say: "the PS4's memory architecture is easier to develop for." The problem is people start suggesting "the Xbone memory architecture is a bitch to develop for" (which was originally mentioned to me by Sword of Doom, who said "a developer" said it in the EDGE article). Look, figuring out the answer to 2+2 is definitely easier than 2x12 (and 2x12 is more "difficult" to figure out than 2+2), but in the grand scheme of things they're both pretty easy to figure out.

So why don't you start replying to the posts refuting your spin?

You can add to that the Edge developer comment from the OP.

So we have GAF Junior Member and MS PR interview vs the rest of the world.

 

BigJoeGrizzly

Neo Member
So we have GAF Junior Member and MS PR interview vs the rest of the world.

1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.

2) Calling the Digital Foundry interview "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?

3) When did NeoGAF = the rest of the world?
 
If someone wanted to say that the PS4's memory architecture is easier to grasp, that's all they have to say: "the PS4's memory architecture is easier to develop for." The problem is people start suggesting "the Xbone memory architecture is a bitch to develop for" (which was originally mentioned to me by Sword of Doom, who said "a developer" said it in the EDGE article).
But from what I can see, your only rebuttal has been "360 wasn't tough, why would the One be?" And several people have explained to you exactly why: the eSRAM setup in the One is not like the eDRAM in the 360. To which you have replied, "Yeah, Microsoft has said it's an evolution." And the repeated response has been that yes, it has evolved in capacity, but it's harder to program in exchange for that greater flexibility.

Sure, programmers are smart guys and the eSRAM isn't insurmountable. But there's plenty of hard evidence that it's more difficult to use than the 360's eDRAM.
 
1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.

2) Calling the Digital Foundry interview "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?

3) When did NeoGAF = the rest of the world?

Waiting for your answers. You got your explanations of why the ESRAM is bad. Not happy?
 