
150MHz CPU boost on XBO, now in production

Status
Not open for further replies.

Spongebob

Banned
For what it's worth, I still find the assertion that there's [x] percent difference in graphical fidelity between different platform versions of a game to be a ridiculous notion. It's one thing to quantify the overall hardware muscle of a system - you've got actual numbers for the difference between systems in each category, from which you can sum up some vague, but not wholly inaccurate, general numerical difference in performance capability. But the whole idea of quantifying graphics is just silly when differences can manifest in all sorts of ways: ways that are purely subjective to one's tastes (from another thread, "I prefer PS1 graphics over PS2 graphics because fuck you it's my opinion"), and even ways that don't show up in the visuals rendered on the screen (e.g. a PS4 version is less optimized because the extra horsepower is simply used to brute-force performance parity with the Xbone version; "lazy devs"). So saying something like "these graphics are [x] percent better than those" just sounds silly to me.

I think what'd be more interesting by this point is if someone did a test to see what kind of different visual enhancements you can get out of a game when you swap between a 7770 and a 7850. That is, take some basic system (maybe try to make it at least somewhat similar to the power of the consoles), put in a 7770, take some game (Crysis 3 is a fun benchmark so let's say our fantasy test uses that), change settings around to get a fairly stable 30FPS in most gameplay scenarios (have some demos handy for testing purposes), then swap the 7770 with a 7850, and adjust the game settings to see what amount of enhancements you can add while still getting that same stable 30FPS framerate. The idea would be to make some general measures of what kind of differences could manifest between the two systems assuming 3rd party devs really try to leverage it; as an alternate take to a pure FPS comparison in something like these benchmarks, instead seeing what improvements can be made within a similar FPS envelope.

Naturally, it's really not all that necessary given it would be inherently inaccurate compared to software specifically coded for these consoles, and given that we're only a few more months from these things launching, but it'd still be something interesting to see in the meantime.
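For rough context on the cards being compared, here's some back-of-envelope FLOPS math using the commonly published shader counts and core clocks (figures are approximate, and FLOPS alone don't predict real-game performance):

```python
# Rough single-precision throughput for the GPUs discussed above, using
# commonly published shader counts and core clocks. FLOPS alone don't
# predict real-game performance; this is just context for the comparison.

def tflops(shaders, mhz, ops_per_clock=2):  # 2 ops/clock = fused multiply-add
    return shaders * mhz * ops_per_clock / 1_000_000

gpus = {
    "HD 7770 GHz Ed.": (640, 1000),
    "HD 7850":         (1024, 860),
    "Xbox One":        (768, 853),
    "PS4":             (1152, 800),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name:16s} {tflops(shaders, clock):.2f} TFLOPS")

ratio = tflops(1152, 800) / tflops(768, 853)
print(f"PS4 / Xbox One: {ratio:.2f}x")  # ~1.41x, i.e. roughly +40%
```

By that crude measure the 7770/7850 pairing brackets the consoles reasonably well, which is why the test idea isn't completely off base.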
Excellent idea, someone needs to do this.
 
So, uh, not sure where to ask this, but what's the deal with some of the threads discussing the hardware differences between the 'Bone/PS4 getting locked now? Are we simply not allowed to bring those tweets into any discussion around here for some reason? Or was it just a "these threads have run their course" thing? Just want some clarification on this, since usually there's some mod note when a thread is closed, but in this case I'm left uncertain.


It is weird. This thread and others have tons more system wars talk and trolling. I thought the one with the Dev tweet and full explanation of his tweet was totally thread worthy then gets shut down out of nowhere. Who knows...
 
Gemüsepizza said:
Good question. A short post from a mod in those closed threads would probably prevent further confusion.

Yep, and they scare me too. When I saw the option to "Quote" posts I thought... oh shiiii!!!... I got banned!!!...

A clarification would be nice.
 

QaaQer

Member
It is weird. This thread and others have tons more system wars talk and trolling. I thought the one with the Dev tweet and full explanation of his tweet was totally thread worthy then gets shut down out of nowhere. Who knows...

I thought it was fun + a dev was posting in it, good thread. Maybe the dev asked for it to be shut down?
 

Fafalada

Fafracer forever
Finalizer said:
An old friend of mine would've told you a different story about RE4...
RE4 was one game out of hundreds. And I could single out several (not many, but more than one) similar cases on 360->PS3.
We can argue theory to no end, but what the market actually produced over their respective lifespans speaks for itself.
 

Jack_AG

Banned
Excellent idea, someone needs to do this.
I believe DF already did and the difference was only a few frames. Mind you, PCs are different beasts than consoles in terms of development, no matter how similar the hardware. You won't see PC games taking advantage of hardware like the X1/PS4.

It's disingenuous to even attempt it, let alone use it as any sort of metric.
 

Jack_AG

Banned
They are a pretty small dev if I'm not mistaken. Hell, a 2D shooter is using 50% of the PS4's power. Doesn't really mean much in the grand scheme of things.
Power. I hate this word. Let's just call it processor uptime, shall we? It was also a reference to the CPU specifically, not overall throughput. At 1080p/60, Resogun does look mighty glorious considering those aren't particle effects - those are voxels. Lots of physics being thrown around gloriously.
 

Finalizer

Member
I believe DF already did and the difference was only a few frames. Mind you, PCs are different beasts than consoles in terms of development, no matter how similar the hardware. You won't see PC games taking advantage of hardware like the X1/PS4.

It's disingenuous to even attempt it, let alone use it as any sort of metric.

IIRC, what DF did was a pure FPS benchmark (what I was specifically saying I wasn't doing in my idea), and even then, for some strange reason they ended up using a 7850 vs. a 7870XT, which just doesn't make sense no matter how you slice it.

And while I wholly admit it's not a terribly accurate measure, it's far more interesting than "Well I'm sure there'll be a 50% difference in graphics," "No, it'll be more like a 10% difference in visuals." Let's ditch the silly, nonsensical percentages and actually try to see ways the hardware differences could manifest in games' visuals assuming devs try to push for a similar FPS envelope.
 

TechnicPuppet

Nothing! I said nothing!
IIRC, what DF did was a pure FPS benchmark (what I was specifically saying I wasn't doing in my idea), and even then, for some strange reason they ended up using a 7850 vs. a 7870XT, which just doesn't make sense no matter how you slice it.

And while I wholly admit it's not a terribly accurate measure, it's far more interesting than "Well I'm sure there'll be a 50% difference in graphics," "No, it'll be more like a 10% difference in visuals." Let's ditch the silly, nonsensical percentages and actually try to see ways the hardware differences could manifest in games' visuals assuming devs try to push for a similar FPS envelope.

If it's 50%, would it not just end up being 720p vs 1080p?
 
I believe DF already did and the difference was only a few frames. Mind you, PCs are different beasts than consoles in terms of development, no matter how similar the hardware. You won't see PC games taking advantage of hardware like the X1/PS4.

It's disingenuous to even attempt it, let alone use it as any sort of metric.
I'm glad DF's disingenuousness is being recognized, with regards to that reprehensibly handled comparison.

If it's 50% would it not just end up being 720p vs 1080p ?
It's closer to 40%, which is the approximate difference in pixel count between 900p and 1080p. Of course, pixel count isn't everything...
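The arithmetic behind those percentages is easy to check with a quick sketch: 1080p carries roughly 44% more pixels than 900p, while the jump from 720p to 1080p is 125%, so a ~50% gap would not stretch anywhere near that far:

```python
# Pixel counts behind the percentages being thrown around.

def pixels(w, h):
    return w * h

for name, (w, h) in {"720p": (1280, 720), "900p": (1600, 900),
                     "1080p": (1920, 1080)}.items():
    print(f"{name:>5s}: {pixels(w, h):,} pixels")

print(f"1080p over 900p: +{pixels(1920, 1080) / pixels(1600, 900) - 1:.0%}")  # +44%
print(f"1080p over 720p: +{pixels(1920, 1080) / pixels(1280, 720) - 1:.0%}")  # +125%
```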
 
D

Deleted member 80556

Unconfirmed Member
It is weird. This thread and others have tons more system wars talk and trolling. I thought the one with the Dev tweet and full explanation of his tweet was totally thread worthy then gets shut down out of nowhere. Who knows...

I thought it was fun + a dev was posting in it, good thread. Maybe the dev asked for it to be shut down?

I thought that too. That would explain the lack of explanation. When someone tweeted him the thread, his response was "oh fuck".

As Nib said, that thread had run its course. People have been 'sensitive' (that's a way of saying it) with console related news. It's getting tiring to be honest. It's essentially become a "rinse and repeat" thing. If anything, the XB1 and PS4 production threads are still open because they are probably the only ones that are talking about the consoles right now.

Still, the real fun will come when we see Albert and Major Nelson (if he gets his account activated, probably within 24 hours) come and spin that dev's tweet (not to mention other devs' statements) into something positive for them.

Anyway, sorry for getting slightly off topic with this explanation.
 

Finalizer

Member
If it's 50% would it not just end up being 720p vs 1080p ?

Not necessarily. Again, it could manifest in all sorts of ways - depending on the game, how its engine works, what stresses out that particular engine more, what the developers prioritize in visuals (some will always go for 60FPS first, so they might let both versions run at 720p/900p/whateverp), so I don't see there being some blanket optimization used entirely across the board. I fully expect it to be different from game to game, maybe expect some consistency in games that use the same middleware engine. (Cryengine, UE3/4, etc) Though I would expect resolution and IQ settings (AA) to be the first things to get toyed around with since they're the most straightforward settings to play with, but that might not be all that gets messed with, or some devs may choose to change other aspects first (asset fidelity, effects, etc.)

The idea for that test would be to come up with some ideas of what devs might be able to do to achieve a similar performance envelope - just drop resolution, and by how much to keep that same framerate? Mess with IQ? Nix visuals slightly across the board? A slight combination of everything? Again, it wouldn't be a test to gain hard evidence of anything, just to give firmer ground for speculation instead of lol percentages from my bumhole.
 

TechnicPuppet

Nothing! I said nothing!
Not necessarily. Again, it could manifest in all sorts of ways - depending on the game, how its engine works, what stresses out that particular engine more, what the developers prioritize in visuals (some will always go for 60FPS first, so they might let both versions run at 720p/900p/whateverp), so I don't see there being some blanket optimization used entirely across the board. I fully expect it to be different from game to game, maybe expect some consistency in games that use the same middleware engine. (Cryengine, UE3/4, etc) Though I would expect resolution and IQ settings (AA) to be the first things to get toyed around with since they're the most straightforward settings to play with, but that might not be all that gets messed with, or some devs may choose to change other aspects first (asset fidelity, effects, etc.)

The idea for that test would be to come up with some ideas of what devs might be able to do to achieve a similar performance envelope - just drop resolution, and by how much to keep that same framerate? Mess with IQ? Nix visuals slightly across the board? A slight combination of everything? Again, it wouldn't be a test to gain hard evidence of anything, just to give firmer ground for speculation instead of lol percentages from my bumhole.

I have no technical knowledge so ridicule if I'm being stupid. Would a resolution bump not be the easiest way to use extra power without doing much extra work considering the systems are so similar?

I know they have many options as to how best to use extra power but would a resolution bump be as relatively simple as it is on PC?
 

RoboPlato

I'd be in the dick
I have no technical knowledge so ridicule if I'm being stupid. Would a resolution bump not be the easiest way to use extra power without doing much extra work considering the systems are so similar?

I know they have many options as to how best to use extra power but would a resolution bump be as relatively simple as it is on PC?

Resolution changes would be the easiest options to shift. If there's a PC version that has higher-end effects than the console versions and they want to hit 1080p on both console platforms, then it would be relatively easy to raise or lower effects to hit the performance target since they'll already have multiple levels of quality programmed.
 
I believe DF already did and the difference was only a few frames. Mind you, PCs are different beasts than consoles in terms of development, no matter how similar the hardware. You won't see PC games taking advantage of hardware like the X1/PS4.

Its disingenuous to even attempt it let alone use it as any sort of metric.

DF used a graphics card that was much more powerful than the one in the Xbox One to represent it. Here's a comparison with graphics cards that are a lot closer to what is actually in the box, though you may not like the numbers.

http://m.neogaf.com/showpost.php?p=74541511&postcount=621
 

Finalizer

Member
I have no technical knowledge so ridicule if I'm being stupid. Would a resolution bump not be the easiest way to use extra power without doing much extra work considering the systems are so similar?

I know they have many options as to how best to use extra power but would a resolution bump be as relatively simple as it is on PC?

Oh yeah for sure, resolution and IQ are the most simple variables to tweak and for the most part are likely the first ones to get shifted. My only point is that devs may not stop there, or that they may choose different paths of optimization because they have different priorities. (e.g. a game is already running at 720p, the devs don't want to lose 60FPS, so they nix visuals in some way instead to keep performance parity) The test would be, in addition to seeing just how much IQ and resolution might be shifted around, to also see what kind of visual difference might come about. I should note the idea wouldn't just be to get a single performance setting that gains FPS parity, but to test a couple of different methods to see what differences would result within the same FPS envelope, but also how significant those visual differences would end up actually being. (Are they really so significant as to be obvious to a casual observer, or will it be fairly minuscule and unremarkable?)
 

TechnicPuppet

Nothing! I said nothing!
Resolution changes would be the easiest options to shift. If there's a PC version that has higher-end effects than the console versions and they want to hit 1080p on both console platforms, then it would be relatively easy to raise or lower effects to hit the performance target since they'll already have multiple levels of quality programmed.

Say BF4, they have said it won't be 1080p on consoles as far as I know.

Would it not just be easier and 'politically' less hassle for the devs to turn off effects on the console versions till the XB1 version runs 60fps at desired resolution then just add 40 to 50 percent more pixels to the PS4 version?
 
DF used a graphics card that was much more powerful than the one in the Xbox One to represent it. Here's a comparison with graphics cards that are a lot closer to what is actually in the box, though you may not like the numbers.

http://m.neogaf.com/showpost.php?p=74541511&postcount=621

badb0y seems to have done a pretty concise comparison of similar cards a month ago... well, damn, that falls in line with that ~50% performance difference tweet.
 
Of course you don't.

Not sure what that's supposed to mean. Unlike some people here, I've already bought and paid for both systems. I don't exactly want a medal or anything, but I've got nothing against the PS4. I'll be a proud PS4 owner as well as a proud Xbox One owner, but I'm keeping my expectations on the differences in games between these two systems realistic until I actually get a good enough reason to believe otherwise. Some may already be convinced, and that's fine, but I'm not just yet.

I look at things like Ryse on the Xbox One, and am completely blown away by how good it looks. I don't know which game that I've seen on the PS4 can claim to look 50% better graphically than that game, and this is just at launch. I think it's the best looking console game of the next gen so far. Although, being totally honest, I literally just rewatched the vidoc again, and I can't for the life of me comprehend how Crytek has the game looking this good and running on the Xbox One, so I kinda hope there's no PC funny business going on, only to be hit with a less impressive looking game at launch, but that's another discussion entirely.

The point is Ryse looks as good as anything I've actually seen real gameplay of on the PS4, and by that I mean Killzone and Infamous, because those are the huge PS4 standouts for me. I happen to think Quantum Break looks awesome for the X1, but then we haven't exactly seen any gameplay for that, just like we haven't seen any gameplay for The Order or Deep Down. So, as much as I look forward to seeing more from the following games:

Quantum Break
The Order: 1886
Deep Down

Not a single one has any real gameplay footage, and the one that has come the closest to showing anything of the sort, based on impressions and footage, has probably been Quantum Break, but cleverly edited scenes in trailers, even if some happen to be snippets of player-controllable gameplay, don't count for much. I would need real gameplay at a proper length to really judge. Deep Down and The Order: 1886 have shown as much or less than Quantum Break. The general point is, I don't see what's wrong with waiting to see if this performance gap materializes into 40 to 50% better looking games. I have strong doubts, but I would love to be wrong, since, again, I, too, am a future PS4 owner, and would certainly love my money's worth. Only getting Killzone and nothing else for the PS4 at launch for now, but that could easily change depending on the multi-platform differences in games like Watch Dogs, which I'm keeping a very close eye on.

DF used a graphics card that was much more powerful than the one in the Xbox One to represent it. Here's a comparison with graphics cards that are a lot closer to what is actually in the box, though you may not like the numbers.

http://m.neogaf.com/showpost.php?p=74541511&postcount=621

A 7770 is also quite a bit weaker than the Xbox One GPU. They may be closer in TFLOPs, but the 7770 has half the triangle setup rate, less video memory, less than 50% the amount of memory bandwidth that's available to the Xbox One GPU, and potentially even less if the max theoretical figure for the ESRAM is even remotely true.
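The bandwidth gap claimed here can be sanity-checked with the published peak figures. Treating the Xbox One's DDR3 and ESRAM bandwidth as additive is an optimistic simplification, and the 102.4 GB/s ESRAM number is the original quoted peak (higher theoretical figures were claimed later), so take this as a sketch, not a measurement:

```python
# Published peak memory bandwidths (GB/s) behind the comparison above.
# Summing DDR3 + ESRAM is an optimistic simplification; real effective
# bandwidth depends on how well a game fits its render targets in ESRAM.

hd7770_bw = 72.0    # 128-bit GDDR5 @ 4.5 Gbps effective
xbo_ddr3  = 68.3    # 256-bit DDR3-2133
xbo_esram = 102.4   # 32 MB ESRAM, original quoted peak

xbo_combined = xbo_ddr3 + xbo_esram
print(f"7770 bandwidth as a share of Xbox One combined peak: "
      f"{hd7770_bw / xbo_combined:.0%}")  # ~42%, i.e. under 50%
```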
 

RoboPlato

I'd be in the dick
I look at things like Ryse on the Xbox One, and am completely blown away by how good it looks. I don't know which game that I've seen on the PS4 can claim to look 50% better graphically than that game, and this is just at launch. I think it's the best looking console game of the next gen so far. Although, being totally honest, I literally just rewatched the vidoc again, and I can't for the life of me comprehend how Crytek has the game looking this good and running on the Xbox One, so I kinda hope there's no PC funny business going on, only to be hit with a less impressive looking game at launch, but that's another discussion entirely.

Have they confirmed if Ryse is native 1080p yet? I know MS hasn't been releasing proper quality videos for us to judge for ourselves. Blim is really annoyed at that. I hope they're not doing any PC footage this close to launch. I think the game looks great graphically as well and I'm also curious as to how they're pulling it off at launch. Even for Crytek it's quite a visual feat.
 

TechnicPuppet

Nothing! I said nothing!
Oh yeah for sure, resolution and IQ are the most simple variables to tweak and for the most part are likely the first ones to get shifted. My only point is that devs may not stop there, or that they may choose different paths of optimization because they have different priorities. (e.g. a game is already running at 720p, the devs don't want to lose 60FPS, so they nix visuals in some way instead to keep performance parity) The test would be, in addition to seeing just how much IQ and resolution might be shifted around, to also see what kind of visual difference might come about. I should note the idea wouldn't just be to get a single performance setting that gains FPS parity, but to test a couple of different methods to see what differences would result within the same FPS envelope, but also how significant those visual differences would end up actually being. (Are they really so significant as to be obvious to a casual observer, or will it be fairly minuscule and unremarkable?)

I hope some sort of pattern emerges because I am buying both consoles and I need to weigh up a lot of things in my head. I don't want to have to wait for DF faceoffs to buy games.
 
Have they confirmed if Ryse is native 1080p yet? I know MS hasn't been releasing proper quality videos for us to judge for ourselves. Blim is really annoyed at that. I hope they're not doing any PC footage this close to launch. I think the game looks great graphically as well and I'm also curious as to how they're pulling it off at launch. Even for Crytek it's quite a visual feat.

No idea, but it certainly looks 1080p to me when I watch the new vidoc on my HDTV, but, again, I'm a bit nervous about that actually being Xbox One footage, and not some powerful PC.

Stop messin' with SenjutsuSage, dude.
He will kill you with his looong looooooooooong posts, tellin' you how amazing the XBone is.

lol posters like you are why we can't have more respectful discussions on here. Do posters like you ever contribute to a thread in a way that isn't mocking somebody or trolling in some fashion? :)
 

Bundy

Banned
lol posters like you are why we can't have more respectful discussions on here. Do posters like you ever contribute to a thread in a way that isn't mocking somebody or trolling in some fashion? :)
Well it's not my job, like yours, to convince people to get a XBone.
I already made up my mind about the PS4 and the XBone.
People are attacking me when I say "PS4 is more powerful".
Well, they can. I'm still tellin' the truth, you know. ;)
 

The Flash

Banned
Well it's not my job, like yours, to convince people to get a XBone.
I already made up my mind about the PS4 and the XBone.
People are attacking me when I say "PS4 is more powerful".
Well, they can. I'm still tellin' the truth, you know. ;)

But the question is...is there a less passive aggressive way for you to do that?



I like you in Modern Family so don't yell at me lol
 
Well it's not my job, like yours, to convince people to get a XBone.
I already made up my mind about the PS4 and the XBone.
People are attacking me when I say "PS4 is more powerful".
Well, they can. I'm still tellin' the truth, you know. ;)

I can assure you, it isn't my job. And if it was, all of GAF would be converted, because I can be awfully persuasive.



I've never attacked anyone for saying the PS4 is more powerful. I just regularly caution people against assuming that just because the PS4 is more powerful, it must somehow also mean the Xbox One is weak. I don't deny the performance edge of the PS4. I just have my doubts that it will materialize in the extreme ways that some people believe. Literally all a developer has to do with their Xbox One version is to lower their resolution. That's my belief, and very few gamers would be able to see the resolution decrease. If I end up wrong, I'll gladly eat some crow.
 

Finalizer

Member
I hope some sort of pattern emerges because I am buying both consoles and I need to weigh up a lot things in my head. I don't want to have to wait for DF faceoffs to buy games.

I really have no idea if a specific pattern will emerge. There's never been a console generation where two consoles were so similar in architecture, so it's hard to say what's really going to happen. I would expect resolution to be the main difference in most cases, and that's assuming 3rd parties really leverage the power difference between the consoles.

For what it's worth, this test idea isn't meant as a means to prove there'll be a major or minor difference between 3rd party games graphically; the point would be to poke around and see what are some possibilities instead of sitting around and throwing vague ideas of what the graphical differences could be.

And just to make another point, I really can't see any scenarios where devs let the Xbone version have significantly worse FPS performance - the comparisons to the 360 vs PS3 are a bit misguided since a lot of those performance issues came down to architectural differences between the systems, hence it took significant extra development time to get a game up and running similarly on the PS3; by comparison, the Xbone is far more straightforward to work with, probably even moreso than the 360. So while I'd say sticking with PS4 versions would be your best bet for multiplat games, if you have some reason you'd rather have the games on the 'Bone (friends on that console, controller preference, whatever), I can't see that one being so much worse that anyone would feel handicapped by owning that particular version instead of the PS4 copy.
 
I really have no idea if a specific pattern will emerge. There's never been a console generation where two consoles were so similar in architecture, so it's hard to say what's really going to happen. I would expect resolution to be the main difference in most cases, and that's assuming 3rd parties really leverage the power difference between the consoles.

For what it's worth, this test idea isn't meant as a means to prove there'll be a major or minor difference between 3rd party games graphically; the point would be to poke around and see what are some possibilities instead of sitting around and throwing vague ideas of what the graphical differences could be.

And just to make another point, I really can't see any scenarios where devs let the Xbone version have significantly worse FPS performance - the comparisons to the 360 vs PS3 are a bit misguided since a lot of those performance issues came down to architectural differences between the systems, hence it took significant extra development time to get a game up and running similarly on the PS3; by comparison, the Xbone is far more straightforward to work with, probably even moreso than the 360. So while I'd say sticking with PS4 versions would be your best bet for multiplat games, if you have some reason you'd rather have the games on the 'Bone (friends on that console, controller preference, whatever), I can't see that one being so much worse that anyone would feel handicapped by owning that particular version instead of the PS4 copy.

Fantastic post. I'm leaning towards Xbox One version for certain MP games, because literally that's what some of my friends are getting, but mostly because I prefer the controller. However, no way I'm buying a crappy version that the PS4 version possibly runs circles around, so I'll make certain first before I purchase. Only MP titles I'm guaranteed to get on the Xbox One are the sports titles, but everything else I'll wait to see what the differences are first.
 
Third parties will definitely keep framerate the same and drop resolution and effects for the Xbox One.

You're going to be left with worse image quality: resolution, textures, aliasing, DoF, etc.

I think the differences will be more noticeable than PS3/360 ports, but the games won't be "bad" on the Xbox One...I think performance will be fine.

I just hope devs target performance for PS4 first and then scale down for Xbox One.
 
third parties will definitely keep framerate the same and drop resolution and effects for the Xbox One.

You're going to be left with worse image quality: resolution, textures, aliasing, DoF, etc.

I think the differences will be more noticeable than PS3/360 ports, but the games won't be "bad" on the Xbox One...I think performance will be fine.

I just hope devs target performance for PS4 first and then scale down for Xbox One.

I doubt you'll be missing that much stuff on the Xbox One version. A decent resolution decrease alone tends to give the weaker GPU the performance required to easily have practically the same overall graphics quality with a variety of features. Decent AA seems a safe bet for most PS4 multi-platforms.
 

TechnicPuppet

Nothing! I said nothing!
I really have no idea if a specific pattern will emerge. There's never been a console generation where two consoles were so similar in architecture, so it's hard to say what's really going to happen. I would expect resolution to be the main difference in most cases, and that's assuming 3rd parties really leverage the power difference between the consoles.

For what it's worth, this test idea isn't meant as a means to prove there'll be a major or minor difference between 3rd party games graphically; the point would be to poke around and see what are some possibilities instead of sitting around and throwing vague ideas of what the graphical differences could be.

And just to make another point, I really can't see any scenarios where devs let the Xbone version have significantly worse FPS performance - the comparisons to the 360 vs PS3 are a bit misguided since a lot of those performance issues came down to architectural differences between the systems, hence it took significant extra development time to get a game up and running similarly on the PS3; by comparison, the Xbone is far more straightforward to work with, probably even moreso than the 360. So while I'd say sticking with PS4 versions would be your best bet for multiplat games, if you have some reason you'd rather have the games on the 'Bone (friends on that console, controller preference, whatever), I can't see that one being so much worse that anyone would feel handicapped by owning that particular version instead of the PS4 copy.

I would rather stick mostly to one system and for various reasons it would be the Xbox all things being equal.

I would need to test out resolution to see if I noticed much. Bad frame rate issues or extra screen tearing would be too much though; that would make me get every multiplat on PS4.
 
I would rather stick mostly to one system and for various reasons it would be the Xbox all things being equal.

I would need to test out resolution to see if I noticed much. Bad frame rate issues or extra screen tearing would be too much though; that would make me get every multiplat on PS4.

Yep, bad framerate and a lot of tearing is a no-no for me this gen.
 

vpance

Member
Literally all a developer has to do with their Xbox One version is to lower their resolution. That's my belief, and very few gamers would be able to see the resolution decrease. If I end up wrong, I'll gladly eat some crow.

GAF gamers would see that res decrease. And we wouldn't let Xbone get away with it that easily, by spreading the word ;)
 
GAF gamers would see that res decrease. And we wouldn't let Xbone get away with it that easily, by spreading the word ;)

Even gaffers can't see the decrease... We literally wait for DF or pixel counters to tell us which games aren't even running at 720p and which ones are. And spreading the word doesn't really affect people that are already playing the game and enjoying it regardless of what the resolution happens to be. Almost nobody on GAF caught on to the fact that COD4 wasn't 720p until pixel counters told them so. It took pixel counters for people to find out Halo 3 and Reach weren't 720p, and many more examples. Although, to me, it was very obvious that something was up with Halo 3 from the get-go, but I didn't know what it was specifically.

And it's going to be even harder to see the decrease this gen, since games will likely be utilizing dynamic resolutions. It's a DirectX 11.2 feature and perfect for the display planes on the Xbox One. In fact, I think the DirectX 11.2 feature exactly describes the Xbox One's display planes. It'll be so much easier to hide these things this gen. We will have to rely on pixel counters yet again. For example, nobody realized that Killer Instinct wasn't 1080p until a dev apparently said so, although I don't know if it was ever confirmed. I have doubts that Ryse can look as good as it does on the Xbox One while being 1080p, but I guess we'll see.
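What pixel counters rely on can be sketched in a few lines: under nearest-neighbour scaling, an upscaled frame only ever contains as many distinct sample columns as the native width, which is what counting the stair steps on a hard edge recovers. A toy illustration (real counting is done by hand on screenshots, not like this):

```python
# Toy illustration of the principle behind "pixel counting": an upscaled
# frame can only show as many distinct sample columns as its native width.

def native_columns_visible(native_w, output_w):
    # Each output column samples exactly one native column under
    # nearest-neighbour scaling; the set of sampled columns therefore
    # has (at most) native_w members.
    return len({x * native_w // output_w for x in range(output_w)})

print(native_columns_visible(1280, 1920))  # 1280: a 720p-wide frame scaled to 1920
print(native_columns_visible(1920, 1920))  # 1920: native 1080p shows every column
```

Dynamic resolution muddies this because the native width changes from frame to frame, which is exactly why it would be harder to pin down.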

Also, someone link me to the uncompressed, high-quality vid of Infamous. I need to show this game to a relative who's visiting, but I don't want him to see some compressed YouTube crap. He wants to just watch it on YouTube, but I'm not having any of it. Poor fool doesn't know any better. :)
 
Even gaffers can't see the decrease... We literally wait for DF or pixel counters to tell us which games aren't even running at 720p and which ones are. And spreading the word doesn't really affect people that are already playing the game and enjoying it regardless of what the resolution happens to be. Almost nobody on GAF caught on to the fact that COD4 wasn't 720p until pixel counters told them so. It took pixel counters for people to find out Halo 3 and Reach weren't 720p, and many more examples. Although, to me, it was very obvious that something was up with Halo 3 from the get-go, but I didn't know what it was specifically.

And it's going to be even harder to see the decrease this gen, since games will likely be utilizing dynamic resolutions. It's a DirectX 11.2 feature and perfect for the display planes on the Xbox One. In fact, I think the DirectX 11.2 feature exactly describes the Xbox One's display planes. It'll be so much easier to hide these things this gen. We will have to rely on pixel counters yet again. For example, nobody realized that Killer Instinct wasn't 1080p until a dev apparently said so, although I don't know if it was ever confirmed. I have doubts that Ryse can look as good as it does on the Xbox One while being 1080p, but I guess we'll see.

Also, someone link me to the uncompressed, high-quality vid of Infamous. I need to show this game to a relative who's visiting, but I don't want him to see some compressed YouTube crap. He wants to just watch it on YouTube, but I'm not having any of it. Poor fool doesn't know any better. :)

You're severely underestimating the impact of resolution.

Sure, many people can't tell exactly what is 720p and what isn't.

But if they are looking at the same game side by side, one 720p and one sub 720p you can EASILY tell a difference.

The image is softer, and as a result the textures aren't as sharp and jaggies can appear more pronounced.

Many 360 games looked much sharper than PS3 titles thanks to better resolution. Your post is pretty disingenuous.
 