I have a pretty powerful PC, and all my gaming is done at 1680x1050
Why build a great gaming machine and never upgrade your monitor?
I have a pretty powerful PC, and all my gaming is done at 1680x1050
http://reviews.cnet.com/lcd-monitors/samsung-syncmaster-2233rz/4505-3174_7-33499496.html
The better of my two PC monitors has a top resolution of 1680x1050.
I don't really have a choice in the matter, but I also don't feel I'm missing very much. The games still look unbelievable at really high graphics settings.
Eh? I'm not suggesting you need to dwell on others' disappointments. You seem to present your case repeatedly as if you expect others to be convinced. If all you care about is your own opinion and not others', then why bother repeatedly justifying your stand? All I'm trying to tell you is that people's thresholds are different. You can keep talking about your own, but it doesn't really add anything to the conversation if you aren't willing to understand how others look at it as well.
Why build a great gaming machine and never upgrade your monitor?
If you go back a year or so before the unveiling, MS was favoured in power simply because of its last platform and its finances. But as we received more information the picture became clearer, and there were a lot of account suicides over MS losing the position of power leader.
No, I think someone reliable (forget who it was) dropped something along the lines of the Xbone being a "Supah-computah".
Based on that, there was a window (however small it was) where it was expected that the Xbone would continue the 360's reign.
Does not compute
If nothing else, they appear to have gone out of their way to avoid such a scenario this time around, so I somehow doubt another RROD scenario. A healthy bit of their engineering focus seems to have been on precisely that issue. On the 1080p front, yeah, I can see developers quite regularly opting for resolutions lower than 1080p to give themselves some extra freedom to do more with their game, and I quite honestly prefer it that way. If they can hit 1080p and still make the game look and run great, then I'm all for it, but I don't see anything wrong with resolutions lower than 1080p. I'd take a higher level of graphics quality over a higher resolution every time, as long as we aren't dealing with sub-720p resolutions, which I honestly don't see happening. Unless you have the performance muscle to spare, the benefit of 1080p is simply greatly outweighed by making an even more impressive looking game at a lower resolution.
Take Quantum Break as an example: no way in hell do I want Remedy to limit their vision because they somehow feel they must run at 1080p. Go as low as they need to, as long as that doesn't mean sub-720p. Anything at or above that, I'm totally fine with.
http://reviews.cnet.com/lcd-monitors/samsung-syncmaster-2233rz/4505-3174_7-33499496.html
The better of my two PC monitors has a top resolution of 1680x1050.
32MB of ESRAM is not big enough for many deferred rendering buffers.
For example, Killzone: Shadow Fall's is around 47MB and BF3's on PC at 1080p was around 55MB; both are bigger than 32MB.
I don't remember the exact numbers for the buffers, but they're about right.
The PS4 does not need to struggle to fit a buffer into memory, as all 8GB of the PS4's memory is very high speed.
Reducing the same 55MB buffer from 1080p to 720p would likely bring it below 32MB so it would fit in ESRAM.
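A quick back-of-the-envelope sketch of that scaling (Python, purely illustrative; it assumes the buffer footprint scales linearly with pixel count, which is a simplification):

```python
# Illustrative only: scale a 1080p deferred-rendering buffer footprint to other
# resolutions and check whether it fits in the Xbox One's 32MB of ESRAM.
# Assumes the footprint scales linearly with pixel count.

def scaled_buffer_mb(size_mb_at_1080p, width, height):
    return size_mb_at_1080p * (width * height) / (1920 * 1080)

bf3_1080p_mb = 55.0  # the approximate BF3 PC figure quoted above

for width, height in [(1920, 1080), (1600, 900), (1280, 720)]:
    size = scaled_buffer_mb(bf3_1080p_mb, width, height)
    verdict = "fits" if size <= 32 else "does not fit"
    print(f"{width}x{height}: ~{size:.1f}MB -> {verdict} in 32MB ESRAM")

# 1920x1080: ~55.0MB -> does not fit
# 1600x900:  ~38.2MB -> does not fit
# 1280x720:  ~24.4MB -> fits
```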
In regards to the ability to run 1080p, I would have figured that it would be more due to a disparity between the GPUs rather than the memory configuration. Aren't there current gen games that hit 1080p with the 360's current memory setup which uses a lot of the same principles as the One?
Also, I wouldn't think that 1080p is a barrier in and of itself; developers should be able to target 1080p. I would think the real distinction is what they have to give up in texture quality and effects, and that trade-off is why they wouldn't hit it.
That may be the reality of the situation, and I'm fine with that. I have a pretty powerful PC, and all my gaming is done at 1680x1050, and I think the new consoles are capable of a level of graphics performance now where a lower resolution doesn't exactly strike me as the end of the world. Look at the better looking 720p titles this generation. I didn't see very many complaints about those, and certainly high quality efforts on the Xbox One will easily top those, and should be almost certain to sport far better image quality.
I'm not criticizing or attacking anyone else's view. I'm giving my view. Big difference.
720p is your new standard? Going 1080p means limiting developer's vision? Wut now?
That's the problem I'm trying to highlight, dear sir. I've seen so many posts from you where all you are saying is "This is what I care about. The other stuff doesn't matter." That's your preference. And that's fine. You can't present that as a valid argument. It is subjective.
The PS3 is capable of delivering 1080p... just because it can doesn't mean it was truly designed for it.
There are a number of limitations in the Xbox One hardware (ROPs, CUs, bandwidth) that will make 1080p more difficult to achieve.
Ultimately I think we'll see more non-1080p games on Xbox One than 1080p titles. We already see that for launch titles...
Hence, I don't see how you can say it was designed for 1080p.
Unless you have the performance muscle to spare, the benefit of 1080p is simply greatly outweighed by making an even more impressive looking game at lower resolutions.
http://reviews.cnet.com/lcd-monitors/samsung-syncmaster-2233rz/4505-3174_7-33499496.html
The better of my two PC monitors has a top resolution of 1680x1050.
I don't really have a choice in the matter, but I also don't feel I'm missing very much. The games still look unbelievable at really high graphics settings.
Well, that wouldn't be too surprising if it did. The PS4 is the clearly stronger system after all. The real question is just what does less eye candy on high quality xbox one releases actually mean? I suspect that people won't be too disappointed by what 343i does with the next Halo, or what Remedy does with Quantum Break. Look at what Crytek are already doing with Ryse at launch. If that's what lower resolution and less eye candy means for the system, I don't think xbox one gamers will exactly be scraping the very bottom of the graphics barrel, do you know what I am saying?
I suspect that people won't be too disappointed by what 343i does with the next Halo
I think this is what we call 'revising history.'
Even before the PS4 unveiling the rumors always supported the PS4 as more powerful. And the fans have always capitalized on this. Let's keep this straight.
So to sum everything up, you're basically saying "It's good enough"
Ahh so 720-900p will be the norm while Forza remains at 1080p. Also the February event fucked up Microsoft's plans.
Thanks Butts.
And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, then I see TV TV Sports TV.
There is one massive reason why hitting native 1080p is more important next gen than 720p vs sub-HD this gen. It's the native res of people's TVs. This gen, even if you had a 720p HDTV, it was highly unlikely your HDTV was 1280x720, so there was scaling going on.
You mention you play your PC games at 1680x1050. For such an arbitrary resolution, I assume it's the native res of your monitor, and therefore the optimal res a game can ever run at for you.
Just look at the Vita. The single biggest factor in IQ in its games is running at native 960x544. If non-native, it doesn't matter how many effects or quality of AA a game has, the image will always look cleaner and better at native res.
So it has nothing to do with trade offs between effects and performance. Just the simple fact that if you have a 1080p TV, native 1080p will always produce the best IQ, so should always be what devs target. It's fucking insane if the very design of the Xbone eSRAM prohibits this.
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).
Hmm...
Well put.
He's been full of stinky shit ever since he joined. Nothing new.
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).
Hmm...
This is just an 'in general' statement, but man, PC gamers really try to ruin all the fun this console generation. It's so boring. You can already get a PC and the consoles aren't out yet; stop trying to embed the PC into every discussion, it's so tedious and obnoxious. It's so odd that the PC-only gamers act like they are trying to better others or educate people on how it is possible to game on a PC. I'm pretty sure everyone knows this. I never thought there would be a gaming community that out-annoys even the most obnoxious of "fanboys".
My mind keeps getting blown lately. SenjutsuSage, despite all his tech talk, games on a 22" 900p monitor (wut), and KKRT00, with all his tech talk and graphical championing, games on a PC rig armed with a GTX 560 (double wut).
Hmm...
One of the reasons the PS3/360 never hit 1080p consistently isn't that they weren't designed for it. They were, but the standards of image quality got high enough that the trade-off of 1080p versus 720p with better IQ wasn't worth it.
Similarly, the XB1 was designed for 1080p, as was the PS4, but the image quality targets between the two are different, and it seems the standard is higher than the XB1 can deliver. So they have to compromise on resolution.
The problem for Microsoft is that it very much appears as if the PS4 has that extra muscle to spare. 1080p is around 40% more pixels than 900p, and the PS4 beats the XB1 by 40% or more in pretty much every part of the rendering pipeline. The only exception is raw memory bandwidth; if the ESRAM and DDR3 are used together you get parity at around 175GB/s (the 200GB/s ESRAM number is incredibly dumb; it's a one-off trick projected into an impossible number).
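For what it's worth, the arithmetic behind those two claims (a rough sketch using peak paper figures, not measured real-world bandwidth):

```python
# Pixel counts: 1080p vs 900p.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p = 1600 * 900     # 1,440,000
print(f"1080p has {pixels_1080p / pixels_900p:.2f}x the pixels of 900p "
      f"(~{(pixels_1080p / pixels_900p - 1) * 100:.0f}% more)")

# Peak memory bandwidth in GB/s (paper figures).
ps4_gddr5 = 176
xb1_ddr3 = 68
xb1_esram_read = 109  # one-direction read; the ~200GB/s claim assumes simultaneous read+write
print(f"PS4: {ps4_gddr5} GB/s vs XB1 DDR3+ESRAM: ~{xb1_ddr3 + xb1_esram_read} GB/s")
```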
I can't get into detail without giving up what I do, which I can't do. So I'll just concede. I honestly should never have brought that up if I couldn't reliably defend it. For that I apologize.
Well, playing the insider card means see a mod or it's a ban. But let's carry on anyway:
You can easily show us the math demonstrating that a render target "easily fits into 32MB". I'll show you my side of our disagreement:
It would almost definitely be 32-bit colour depth. We could make the case it wouldn't fit in 32MB depending on the techniques used; for example, with 4x FSAA at 1080p we'd be looking at:
Back Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel] * 4[FSAA Depth]
= 265420800 bits = 31.6MB
Depth Buffer:
1920x1080 [Resolution] * 32 [24Bit Z, 8Bit Stencil] * 4 [FSAA Depth]
= 265420800 bits = 31.6MB
Front Buffer:
1920x1080 [Resolution] * 32 [Bits Per Pixel]
= 66355200 bits = 7.91MB
Total Memory Requirements:
31.6 + 31.6 + 7.91 = 71.11MB
It's not quite as simple as you put it.
Without the AA you'd be looking at around 24mb.
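Reproducing that math in a few lines (a minimal sketch with the same assumptions as above: 32 bits per pixel, 4x FSAA on the back and depth buffers):

```python
# Traditional (non-deferred) render target sizes, in MB (1MB = 1024*1024 bytes).

def buffer_mb(width, height, bits_per_pixel, samples=1):
    return width * height * bits_per_pixel * samples / 8 / (1024 * 1024)

W, H = 1920, 1080
back = buffer_mb(W, H, 32, samples=4)   # 32bpp colour, 4x FSAA
depth = buffer_mb(W, H, 32, samples=4)  # 24-bit Z + 8-bit stencil, 4x FSAA
front = buffer_mb(W, H, 32)             # 32bpp colour, no AA

print(f"back: {back:.2f}MB, depth: {depth:.2f}MB, front: {front:.2f}MB")
print(f"total with 4x FSAA: {back + depth + front:.2f}MB")      # ~71.2MB
print(f"total without AA:   {3 * buffer_mb(W, H, 32):.2f}MB")   # ~23.7MB
```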
Though this method is becoming less and less used in games development:
That's not applicable at all to modern rendering techniques.
Skeff further up is closer but probably not familiar with deferred shading. When you are using deferred shading, you need to store all the input data for your later lighting calculations in your render buffer (usually called a g-buffer). The exact amount of memory you need to do this depends on the engine and what you are going for, but you're unlikely to get by with less than 16 bytes per pixel.
Quite.
Further to my comment on a previous page, I looked up the more exact figure for BF3's gbuffers @ 1080p/4xMSAA: 158MB.
http://www.slideshare.net/fullscreen/DICEStudio/shiny-pc-graphics-in-battlefield-3/20
You could work from that figure for 1080p/NoAA. Though BF4's may differ.
Without AA?
That works out to 80 bytes per pixel for 4xAA. Assuming it scales linearly with AA samples (which is an absolute worst-case scenario), that would mean 20 bytes per pixel with no AA, which would result in a ~40MB buffer at 1080p.
If we take Durante's suggested minimum of 16 bytes per pixel, we'd be looking at around 31.64MB, which is about as small as you could go.
This is essentially the MINIMUM you could get away with; most games would require much more.
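The same figures expressed per pixel (a small sketch; the linear-scaling assumption for MSAA is the simplification noted above):

```python
pixels_1080p = 1920 * 1080
MB = 1024 * 1024

# BF3 g-buffers at 1080p with 4x MSAA: ~158MB (DICE slide linked above).
print(f"4x MSAA: ~{158 * MB / pixels_1080p:.0f} bytes per pixel")          # ~80 B/px

# If the footprint scaled linearly with sample count:
print(f"no AA:   ~{158 / 4 * MB / pixels_1080p:.0f} bytes per pixel, "
      f"~{158 / 4:.0f}MB at 1080p")                                        # ~20 B/px, ~40MB

# Durante's suggested minimum of 16 bytes per pixel:
print(f"16 B/px: {16 * pixels_1080p / MB:.2f}MB at 1080p")                 # ~31.64MB
```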
There is one massive reason why hitting native 1080p is more important next gen than 720p vs sub-HD this gen. It's the native res of people's TVs. This gen, even if you had a 720p HDTV, it was highly unlikely your HDTV was 1280x720, so there was scaling going on.
You mention you play your PC games at 1680x1050. For such an arbitrary resolution, I assume it's the native res of your monitor, and therefore the optimal res a game can ever run at for you.
Just look at the Vita. The single biggest factor in IQ in its games is running at native 960x544. If non-native, it doesn't matter how many effects or quality of AA a game has, the image will always look cleaner and better at native res.
So it has nothing to do with trade offs between effects and performance. Just the simple fact that if you have a 1080p TV, native 1080p will always produce the best IQ, so should always be what devs target. It's fucking insane if the very design of the Xbone eSRAM prohibits this.
And I remember Major Nelson passing off that Sony event as nothing, posting pictures of MS guys watching it and suggesting that there was popcorn (iirc). The messaging from MS on that event made it seem like it didn't affect them at all. It made me interested in seeing what they had up their sleeves because of that, then I see TV TV Sports TV.
If that proves to be true, let's just hope it doesn't bring the PS4 versions down too.
I'm mostly worried about EA, but we'll know soon enough when BF4 is released.
Why does this blow your mind? Console gamers go gaga over graphics but play on, er, consoles. How many GT/KZ2/UC gifs exist to show off awesome graphics? Their view is common enough.
It blows my mind because I wrongly expected people who were heavily devoted to tech talk and graphical fidelity to have invested somewhat into extracting more in that regard. 22" 900p monitors and GTX 560s are hardly the epitome of high tech. It's no surprise, for example, that SenjutsuSage would care so little for 1080p if his monitor doesn't even offer the resolution.
I don't disagree, but I also don't see it as a problem that will meaningfully show up to such an extent that people will feel like crap for even owning an Xbox One. There will be amazing looking games on the Xbox One regardless of what the numbers are, and while I completely understand why these numbers would be such a very big deal around here, I don't feel it will matter to the broader audience of gamers that just want great looking games regardless of where they find them. As an example, despite the Xbox 360 regularly having the superior versions of multi-platform games, there isn't a thing you could tell the folks that bought the PS3 version of games that would convince them that they were somehow getting a less than stellar gaming experience. If the consoles are being viewed in a purely competitive light, and only ever under such pretenses, then the differences between the two systems are really all that matters. However, most gamers really don't see things that way. They get a system, they buy games, and then they enjoy themselves.
But then I suppose I totally understand how that view might seem pretty out of place in a thread primarily about the differences in power between the two systems, so I've probably said about as much as is necessary on the matter. Other people see things differently, and I respect that.
I'm "clinging" to a stated fact that was right out of the Xbone software engineer's mouth. Its not like I'm making any of this up, and using a straw man's argument to fit my opinions. Evolution is about natural growth, because you are now better equipped to deal with an environment or situation than you were previously.
Here is the problem with the ESRAM on the Xbox One compared to the EDRAM on the 360: the One is roughly 6 times more powerful, but its main memory bandwidth is only 3 times greater. This places more pressure on the embedded memory to bridge the gap, yet it is only about 3 times larger and is actually slower. So both the bandwidth and capacity ratios relative to GPU power are quite a bit worse, requiring a lot more optimization. And the proliferation of deferred rendering techniques, with the large buffers and numerous render targets they employ, only exacerbates the situation.
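Roughly, the ratios being described (approximate public figures; treat this as an illustrative sketch rather than exact specs):

```python
# Approximate public specs; GPU throughput in GFLOPS, bandwidth in GB/s, capacity in MB.
xbox_360 = {"gpu": 240, "main_bw": 22.4, "embedded_capacity": 10}
xbox_one = {"gpu": 1310, "main_bw": 68, "embedded_capacity": 32}

for key in xbox_360:
    print(f"{key}: {xbox_one[key] / xbox_360[key]:.1f}x")
# gpu: ~5.5x, main_bw: ~3.0x, embedded_capacity: 3.2x
# (And the 360's EDRAM offered ~256GB/s to its ROPs, versus ~109-204GB/s for the
#  One's ESRAM, which is the "actually slower" point above.)
```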
Thanks to ESRAM, 720p to 900p will be the norm on the Xbox One.
No one is saying that the 360 was difficult; they were saying it was "more difficult than not having EDRAM", and that doesn't add much complexity.
And of course the PS4 is relevant here: the difficulty of development is entirely relative to the difficulty of developing for the competing console or PC. That's what gives us a baseline for what is easy and what is hard. It's like playing a sport; it's only as hard as whoever you're playing against.
Let's look again at that quote you posted:
OK, so a new ability to texture out of ESRAM. You know what this means? A new API for it. But 32MB is small; can you put full textures in there, or are you forced to use PRT? I'd guess PRT, which means another new API. So you would have to use two new APIs to do this. How good are these new APIs? We don't know; from the developer quote in the OP of this thread we could say "horrible".
Do you know what this means? It means the memory is now addressable, so developers can choose where to put things. This allows for more possibilities. HOWEVER, it also means developers need to put things in the right place. Now that we know they do this, let's look again at the ESRAM and the "1024-bit bus".
It's not a 1024-bit bus; it's four 256-bit buses on four pipelines, each attached to 8MB of ESRAM. Developers need to manage these now that they are addressable, and to get access to the 109GB/s read speed it seems you need your data spread across all four 8MB chunks.
ESRAM is not EDRAM, and the article you quoted is MS PR that shows falsehoods even in the small snippet you quoted.
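For reference, the quoted ~109GB/s read figure falls straight out of that 4x 256-bit layout (a quick sketch, assuming the reported 853MHz clock applies to the ESRAM):

```python
lanes = 4
bits_per_lane = 256
clock_hz = 853e6  # reported GPU/ESRAM clock

bytes_per_cycle = lanes * bits_per_lane / 8            # 128 bytes per cycle
read_bandwidth_gbs = bytes_per_cycle * clock_hz / 1e9  # one direction only
print(f"{bytes_per_cycle:.0f} bytes/cycle x 853MHz = ~{read_bandwidth_gbs:.1f} GB/s")
# ~109.2 GB/s read; the ~200GB/s marketing figure assumes simultaneous read and write.
```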
Well said
Did you just write that? I am playing Red Dead Redemption at the moment and guess what? Every second I am playing that game, I am telling myself "I AM PLAYING THE SHITTIEST VERSION OF THIS GAME". So please stop! Just stop!
Edit: I own a PS3 and I can only imagine how much better that beautiful world Rockstar created looks on the 360, 'cause I've seen the side-by-side comparisons.
If you look at my post history, you will see I've been active in all the hardware rumour threads around the next gen consoles.
You're most certainly incorrect. All the smart money was on the Xbox 3 being more powerful.
This was true when the first hardware leaks started as well. It was only in the tail end of last year (weeks before the reveal of the PS4) that the Orbis edged out in front, and even then there was lots of talk about secret sauce / dual GPUs / dual SoCs in the Durango that would mean it eclipsed the Orbis in power.
I don't feel it will matter to the broader audience of gamers that just want great looking games regardless of where they find them. As an example, despite the Xbox 360 regularly having the superior versions of multi-platform games, there isn't a thing you could tell the folks that bought the PS3 version of games that would convince them that they were somehow getting a less than stellar gaming experience.
It's nice to want things I guess.
If people just went about stumbling upon consoles by happenstance, I suppose that would be true. Otherwise it's an awfully convenient threshold of tolerance.
I'll take your word for it but all I know is when I came here to check out GAF's reaction (which I admit fits your time frame) Sony was always ahead in opinions.
As a former Genetics major I'd like to point out that skeff is right. A fair amount of change is just drift without selection. Evolution doesn't have a direction; critical traits are selected for and against strongly, but non-critical ones can languish, vanish, or become omnipresent.
As for the ESRAM: it's an 'evolution' with selection criteria of 'make it cheap', 'make it easy to shrink' and 'be okay for apps and gaming', seemingly in that order. The evolution he may be referring to is the higher bandwidth and larger amount versus the EDRAM, but that says nothing about its comparison to the PS4.
Even so, you'd recall he's on the clock. That engineer, all the people from MS, Mark Cerny and all of the people from Sony have a transparent agenda: to present their machine in the best light.
Skeff is also right that it's about relative difficulty. For the PS3 vs 360, the EDRAM was a bonus. Almost everything it offered was 'free' in comparison to the PS3. The bar was low and the expectation for its use was low. The 360 GPU was already more capable and the architecture had fewer bottlenecks.
For the XB1 vs PS4, the ESRAM is not a free extra anymore. It's now required to overcome a bottleneck, so more time will be spent tinkering with it to match the memory throughput of the PS4. By all accounts it's not NEARLY as hard as juggling the six exotic mini-processors of the Cell, but it is harder than the PS4, which seems more straightforward than even current commodity PCs.
You also had chumps like SuperDAE swearing blind that the Durango was all powerful and the Orbis was a joke. He was maintaining this up to the end, well past the official PS4 reveal.
I'll take your word for it but all I know is when I came here to check out GAF's reaction (which I admit fits your time frame) Sony was always ahead in opinions.
If I had a choice of Ryse at 1080p, but with the game doing less technically, compared to what we now have at 900p, I take the 900p version.
The thing is, you do not know what we now have at 900p. The only direct-feed screenshots and videos released have been rendered at 1080p (higher than the game will actually run) or downscaled to 720p (from an unknown source resolution, giving free AA). Now, I don't expect the game to look considerably worse in terms of geometry, textures, etc. But I do expect the game to have more jaggies and shimmering than any media we've seen to date. I'm not sure that's an acceptable tradeoff for tons of blades of grass.
He makes some fantastic points, but I feel game content below 1080p scales very, very well on my 1080p television. I think it really comes down to the quality of the scaler in a person's television, or the quality of the scaler in the system itself.
My HDTV is native 1080p, but games like Halo 4, Uncharted 2, GTA V, and a multitude of others all look quite amazing on it. Pretty much the only issue I ever run into is not being able to read text on the user interface as well as I'd like. My go-to examples at the moment are reading text messages on the cell phone in GTA V, or trying to get a good look at where the cops are on the mini-map while escaping from them. So, my main argument is: if you have a native 1080p television right now and don't have any major complaints about 720p games on the 360 or PS3, why would scaling issues that are already non-existent suddenly be exacerbated on the more powerful Xbox One, which will produce much superior graphics quality at even higher resolutions than the 360 or PS3?
As a former biology/psychology double major, does that now give me the right to have my own interpretation of what "evolution" means? I'm not talking about the evolution of species here, I'm talking about the evolution of technology, which in many ways is a different interpretation. Evolution (as you know, as a former genetics major) relative to species has more to do with adaptation, survival, and the ability to pass those genes on to future generations. Evolution in technology has similar principles, but is mainly used loosely as another way to describe the improvement of tech from past to present.
And "relativity" was not originally the argument some here were making when they attempted to pick apart my initial statements (the shift to a "relativity" argument against my statements has only just started to surface). If someone wanted to say that the PS4's memory architecture is easier to grasp, that's all they have to say: "the PS4's memory architecture is easier to develop for." The problem is people start suggesting "the Xbone memory architecture is a bitch to develop for" (which was originally mentioned to me by Sword of Doom, who said "a developer" said it in the EDGE article). Look, figuring out the answer to 2+2 compared to 2x12 is definitely easier (and 2x12 is more "difficult" to figure out than 2+2), but in the grand scheme of things they're both pretty easy to figure out.
So why don't you start replying to the posts refuting your spin?
You can add to that the Edge developer comment from the OP.
So we have GAF Junior Member and MS PR interview vs the rest of the world.
So we have GAF Junior Member and MS PR interview vs the rest of the world.
But from what I can see, your only rebuttal has been that "360 wasn't tough, why would One be?" And several people have explained to you exactly why: the eSRAM setup in the One is not like the eDRAM in the 360. To which you have replied, "Yeah, Microsoft has said it's an evolution". And the repeated response has been that yes, it has evolved capacity, but it's harder to program in order to allow that greater flexibility.
1) What the hell does being a "Junior Member" have to do with a person's ability to have valid points and opinions? Didn't know "Junior Member" = ignore list.
2) Dismissing the Digital Foundry interview as "MS PR" is subjective at best. Should I suggest that any future Sony-related interviews are strictly "Sony PR" and all information within should be considered invalid?
3) When did NeoGaf = the rest of the world?