I wonder if KI could possibly be 1080p on PS4?
ki is not 1080p? what?
EvB said: Also didn't Gamecube games generally look and run significantly better than their PS2 counterparts?
That difference was like the one between 360/PS3 multiplats.
For what it's worth, I still find the assertion of there being [x] percentage difference in graphical fidelity between different platform versions of a game to be a ridiculous notion. It's one thing to quantify the overall hardware muscle of a system - you've got actual numbers for the difference between systems in different categories, with which you can sum up some vague, but not wholly inaccurate, general numerical difference in performance capability. But quantifying graphics is silly when the differences can manifest in all sorts of ways - ways that are purely subjective to one's tastes (from another thread, "I prefer PS1 graphics over PS2 graphics because fuck you it's my opinion"), and even ways that don't show up in the visuals rendered on the screen (e.g. a PS4 version is less optimized because the extra horsepower is simply used to brute-force performance parity with the Xbone version; "lazy devs"). So saying something like "these graphics are [x] percent better than those" just sounds silly to me.
I think what'd be more interesting by this point is if someone did a test to see what kind of different visual enhancements you can get out of a game when you swap between a 7770 and a 7850. That is, take some basic system (maybe try to make it at least somewhat similar to the power of the consoles), put in a 7770, take some game (Crysis 3 is a fun benchmark so let's say our fantasy test uses that), change settings around to get a fairly stable 30FPS in most gameplay scenarios (have some demos handy for testing purposes), then swap the 7770 with a 7850, and adjust the game settings to see what amount of enhancements you can add while still getting that same stable 30FPS framerate. The idea would be to make some general measures of what kind of differences could manifest between the two systems assuming 3rd party devs really try to leverage it; as an alternate take to a pure FPS comparison in something like these benchmarks, instead seeing what improvements can be made within a similar FPS envelope.
Naturally, it's really not all that necessary given it would be inherently inaccurate compared to software specifically coded for these consoles, and given that we're only a few more months from these things launching, but it'd still be something interesting to see in the meantime.
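To make the test idea concrete, here's a rough sketch of that tuning loop in Python. Everything in it is invented for illustration - the settings names, the `measure_fps` stand-in, and the target number are hypothetical, not from any real benchmark tool:

```python
# Hypothetical sketch of the "same FPS envelope" test: starting from the
# settings that hold 30FPS on the weaker card, greedily re-enable quality
# options on the stronger card while the measured framerate stays stable.
# measure_fps() stands in for running a repeatable benchmark demo.

TARGET_FPS = 30

def tune_settings(settings, upgrades, measure_fps):
    """settings: dict of current options; upgrades: ordered list of
    (option, higher_value) candidates to try, cheapest-looking first."""
    for option, higher_value in upgrades:
        trial = dict(settings, **{option: higher_value})
        if measure_fps(trial) >= TARGET_FPS:  # still inside the FPS envelope
            settings = trial                  # keep the upgrade
    return settings
```

The idea would be to feed it the 7770's stable-30FPS settings plus a list of candidate upgrades and see which ones the 7850 can absorb without dropping below the target.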
So, uh, not sure where to ask this, but what's the deal with some of the threads discussing the hardware differences between the 'Bone/PS4 getting locked now? Are we simply not allowed to bring those tweets into any discussion around here for some reason? Or was it just a "these threads have run their course" thing? Just want some clarification on this, since usually there's some mod note when a thread is closed, but in this case I'm left uncertain.
Gemüsepizza said: Good question. A short post from a mod in those closed threads would probably prevent further confusion.
It is weird. This thread and others have tons more system wars talk and trolling. I thought the one with the Dev tweet and full explanation of his tweet was totally thread worthy then gets shut down out of nowhere. Who knows...
I thought it was fun + a dev was posting in it, good thread. Maybe the dev asked for it to be shut down?
I wonder if KI could possibly be 1080p on PS4?
Ya. That's a very good question, considering that every PS4 game ever made will run at 1080p and 60fps.
Ya. That's a very good question, considering that every PS4 game ever made will run at 1080p and 60fps.
I like the cut of your jib.
Finalizer said: An old friend of mine would've told you a different story about RE4...
RE4 was one game out of hundreds, and I could single out several (not many, but more than one) similar cases on 360 -> PS3.
Excellent idea, someone needs to do this.
I believe DF already did and the difference was only a few frames. Mind you, PCs are different beasts than consoles in terms of development, no matter how similar the hardware. You won't see PC games taking advantage of hardware like the X1/PS4.
They are a pretty small dev if I'm not mistaken. Hell, a 2D shooter is using 50% of the PS4's power. Doesn't really mean much in the grand scheme of things.
Power. I hate this word. Let's just call it processor uptime, shall we? It was also a reference to the CPU specifically, not overall throughput. At 1080p/60, Resogun does look mighty glorious considering those aren't particle effects - those are voxels. Lots of physics being thrown around gloriously.
I believe DF already did and the difference was only a few frames. Mind you, PCs are different beasts than consoles in terms of development, no matter how similar the hardware. You won't see PC games taking advantage of hardware like the X1/PS4.
It's disingenuous to even attempt it, let alone use it as any sort of metric.
IIRC, what DF did was a pure FPS benchmark (what I was specifically saying I wasn't doing in my idea), and even then, for some strange reason they ended up using a 7850 vs. a 7870XT, which just doesn't make sense no matter how you slice it.
And while I wholly admit it's not a terribly accurate measure, it's far more interesting than "Well I'm sure there'll be a 50% difference in graphics," "No, it'll be more like a 10% difference in visuals." Let's ditch the silly, nonsensical percentages and actually try to see how the hardware differences could manifest in games' visuals, assuming devs push for a similar FPS envelope.
It's disingenuous to even attempt it, let alone use it as any sort of metric.
I'm glad DF's disingenuousness is being recognized, with regards to that reprehensibly handled comparison.
If it's 50%, would it not just end up being 720p vs 1080p?
It's closer to 40%, which is the approximate difference in pixel count between 900p and 1080p. Of course, pixel count isn't everything...
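For reference, the raw pixel arithmetic behind those percentages (plain resolution math, nothing console-specific):

```python
# Back-of-the-envelope pixel counts for common 16:9 render resolutions.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1080p vs 900p: 2,073,600 / 1,440,000 = 1.44, i.e. ~44% more pixels
ratio_1080_900 = pixels["1080p"] / pixels["900p"]

# 1080p vs 720p: 2,073,600 / 921,600 = 2.25, i.e. 125% more pixels
ratio_1080_720 = pixels["1080p"] / pixels["720p"]
```

So a 40-50% compute gap maps more naturally to 900p-vs-1080p than to 720p-vs-1080p, at least in raw fill terms.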
I thought it was fun + a dev was posting in it, good thread. Maybe the dev asked for it to be shut down?
I thought that too. That would explain the lack of explanation. When someone tweeted him the thread, his response was "oh fuck".
If it's 50%, would it not just end up being 720p vs 1080p?
Not necessarily. Again, it could manifest in all sorts of ways depending on the game: how its engine works, what stresses that particular engine more, and what the developers prioritize in visuals (some will always go for 60FPS first, so they might let both versions run at 720p/900p/whateverp). So I don't see some blanket optimization being used entirely across the board; I fully expect it to be different from game to game, with maybe some consistency among games that use the same middleware engine (CryEngine, UE3/4, etc). I would expect resolution and IQ settings (AA) to be the first things to get toyed around with, since they're the most straightforward settings to play with, but that might not be all that gets changed - some devs may choose to change other aspects first (asset fidelity, effects, etc.)
The idea for that test would be to come up with some ideas of what devs might do to achieve a similar performance envelope - just drop resolution, and by how much, to keep the same framerate? Mess with IQ? Nix visuals slightly across the board? A combination of everything? Again, it wouldn't be a test to gain hard evidence of anything, just to give firmer ground for speculation instead of lol percentages from my bumhole.
I have no technical knowledge so ridicule if I'm being stupid. Would a resolution bump not be the easiest way to use extra power without doing much extra work considering the systems are so similar?
I know they have many options as to how best to use extra power but would a resolution bump be as relatively simple as it is on PC?
Resolution changes would be the easiest option to shift. If there's a PC version that has higher-end effects than the console versions, and they want to hit 1080p on both console platforms, then it would be relatively easy to raise or lower effects to hit the performance target, since they'll already have multiple levels of quality programmed.
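As a hypothetical illustration of that "tiers already exist" point - the effect names and tier lists below are invented, but the shape is how PC-style settings menus tend to work, so a console target is roughly "pick a tier per effect":

```python
# Invented example: pre-built quality tiers for each effect, and a per-platform
# pick from those tiers. None of these names come from any real game.
QUALITY_TIERS = {
    "shadows": ["low", "medium", "high", "very_high"],
    "ambient_occlusion": ["off", "ssao", "hbao"],
    "texture_res": ["half", "full"],
}

PLATFORM_PICKS = {
    "ps4": {"shadows": "high", "ambient_occlusion": "hbao", "texture_res": "full"},
    "xbox_one": {"shadows": "medium", "ambient_occlusion": "ssao", "texture_res": "full"},
}

def validate(picks):
    # Every chosen tier must exist among the pre-built quality levels.
    return all(v in QUALITY_TIERS[k] for k, v in picks.items())
```

Under this framing, hitting a performance target is a matter of moving individual effects down a tier rather than authoring anything new.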
DF used a graphics card that was much more powerful than the one in the Xbox One to represent it. Here's a comparison with graphics cards that are a lot closer to what is actually in the box, though you may not like the numbers.
http://m.neogaf.com/showpost.php?p=74541511&postcount=621
Of course you don't.
I look at things like Ryse on the Xbox One, and am completely blown away by how good it looks. I don't know of any game I've seen on the PS4 that can claim to look 50% better graphically than it, and this is just at launch. I think it's the best looking console game of the next gen so far. Although, being totally honest, I literally just rewatched the vidoc again, and I can't for the life of me comprehend how Crytek has the game looking this good and running on the Xbox One, so I kinda hope there's no PC funny business going on, only for us to be hit with a less impressive looking game at launch - but that's another discussion entirely.
Oh yeah, for sure, resolution and IQ are the simplest variables to tweak and for the most part are likely the first ones to get shifted. My only point is that devs may not stop there, or they may choose different paths of optimization because they have different priorities. (e.g. a game is already running at 720p, the devs don't want to lose 60FPS, so they nix visuals in some way instead to keep performance parity) The test would be, in addition to seeing just how much IQ and resolution might be shifted around, to also see what kind of visual difference might come about. I should note the idea wouldn't just be to get a single performance setting that gains FPS parity, but to test a couple of different methods to see what differences would result within the same FPS envelope, and how significant those visual differences would actually end up being. (Are they really so significant as to be obvious to a casual observer, or will it be fairly minuscule and unremarkable?)
Have they confirmed if Ryse is native 1080p yet? I know MS hasn't been releasing proper quality videos for us to judge for ourselves. Blim is really annoyed at that. I hope they're not doing any PC footage this close to launch. I think the game looks great graphically as well and I'm also curious as to how they're pulling it off at launch. Even for Crytek it's quite a visual feat.
Stop messin' with SenjutsuSage, dude.
He will kill you with his looong looooooooooong posts, tellin' you how amazing the XBone is.
Well it's not my job, like yours, to convince people to get a XBone.
lol, posters like you are why we can't have more respectful discussions on here. Do posters like you ever contribute to a thread in a way that isn't mocking somebody or trolling in some fashion?
Well it's not my job, like yours, to convince people to get a XBone.
I already made up my mind about the PS4 and the XBone.
People are attacking me when I say "PS4 is more powerful".
Well, they can. I'm still tellin' the truth, you know.
If it's an exclusive 2D fighting game, then I hope so.
I hope some sort of pattern emerges, because I am buying both consoles and I need to weigh up a lot of things in my head. I don't want to have to wait for DF face-offs to buy games.
I really have no idea if a specific pattern will emerge. There's never been a console generation where two consoles were so similar in architecture, so it's hard to say what's really going to happen. I would expect resolution to be the main difference in most cases, and that's assuming 3rd parties really leverage the power difference between the consoles.
For what it's worth, this test idea isn't meant as a means to prove there'll be a major or minor difference between 3rd party games graphically; the point would be to poke around and see what are some possibilities instead of sitting around and throwing vague ideas of what the graphical differences could be.
And just to make another point, I really can't see any scenario where devs let the Xbone version have significantly worse FPS performance. The comparisons to 360 vs PS3 are a bit misguided, since a lot of those performance issues came down to architectural differences between the systems - hence it took significant extra development time to get a game up and running similarly on the PS3. By comparison, the Xbone is far more straightforward to work with, probably even more so than the 360. So while I'd say sticking with PS4 versions would be your best bet for multiplat games, if you have some reason you'd rather have the games on the 'Bone (friends on that console, controller preference, whatever), I can't see that version being so much worse that anyone would feel handicapped by owning it instead of the PS4 copy.
third parties will definitely keep framerate the same and drop resolution and effects for the Xbox One.
You're going to be left with worse image quality: resolution, textures, aliasing, DoF, etc.
I think the differences will be more noticeable than PS3/360 ports, but the games won't be "bad" on the Xbox One...I think performance will be fine.
I just hope devs target performance for PS4 first and then scale down for Xbox One.
I would rather stick mostly to one system, and for various reasons it would be the Xbox, all things being equal.
I would need to test out resolution differences to see if I noticed much. Bad framerate issues or extra screen tearing would be too much though; that would make me get every multiplat on PS4.
Literally all a developer has to do with their Xbox One version is lower the resolution. That's my belief, and very few gamers would be able to see the resolution decrease. If I end up wrong, I'll gladly eat some crow.
On PS4? Haha not in this lifetime pal.
GAF gamers would see that res decrease. And we wouldn't let Xbone get away with it that easily - we'd spread the word.
Even gaffers can't see the decrease... We literally wait for DF or pixel counters to tell us which games aren't running at 720p and which ones are. And spreading the word doesn't really affect people who are already playing the game and enjoying it regardless of what the resolution happens to be. Almost nobody on GAF caught on to the fact that COD4 wasn't 720p until pixel counters said so. It took pixel counters for people to find out Halo 3 and Reach weren't 720p, and there are many more examples. Although, to me, it was very obvious that something was up with Halo 3 from the get-go - I just didn't know what it was specifically.
And it's going to be even harder to see the decrease this gen, since games will likely be utilizing dynamic resolutions. It's a DirectX 11.2 feature and perfect for the display planes on the Xbox One - in fact, I think that DirectX 11.2 feature describes the Xbox One's display planes exactly. It'll be so much easier to hide these things this gen. We will have to rely on pixel counters yet again. For example, nobody realized that Killer Instinct wasn't 1080p until a dev apparently said so, although I don't know if it was ever confirmed. I have doubts that Ryse can look as good as it does on the Xbox One while being 1080p, but I guess we'll see.
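For anyone curious what dynamic resolution means in practice, here's a minimal, purely hypothetical sketch of the idea: render at a variable internal resolution chosen from recent frame times, then let the hardware scaler output a fixed 1080p image. The numbers and function names are illustrative, not from DirectX or any real engine:

```python
# Hypothetical dynamic resolution controller: nudge the internal render scale
# toward a frame-time budget, clamped to a sane range. The final image is
# always upscaled to the fixed output resolution, which is why a casual
# observer rarely notices the internal resolution moving around.

TARGET_FRAME_MS = 33.3              # 30 FPS budget
MIN_SCALE, MAX_SCALE = 0.75, 1.0    # e.g. 1440x810 up to full 1920x1080

def next_scale(current_scale, last_frame_ms):
    """Pick the next frame's resolution scale from the last frame's cost."""
    if last_frame_ms > TARGET_FRAME_MS:           # over budget: render fewer pixels
        current_scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.9:   # comfortable headroom: add pixels back
        current_scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, current_scale))

def render_resolution(scale, out_w=1920, out_h=1080):
    """Internal render target size for a given scale, before upscaling to output."""
    return int(out_w * scale), int(out_h * scale)
```

The point is that resolution stops being a single fixed number per game, which is exactly why pixel counting gets harder.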
Also, someone link me to the uncompressed, high quality vid of Infamous. I need to show this game to a relative who's visiting, but I don't want him to see some compressed YouTube crap. He wants to just watch it on YouTube, but I'm not having any of it. Poor fool doesn't know any better.
But if they are looking at the same game side by side, one at 720p and one sub-720p, you can EASILY tell a difference.