
Ubisoft: Watch_Dogs will run at 900p on PS4 and 792p on XB1, both at 30fps

TyrantII

Member
I don't think GTAV was a solid 30 fps. Hence why I can't wait for a PS4 version that hopefully has 60 fps and/or higher-resolution textures/1080p.

Not even close. It dropped to the teens and, on rare occasion, into the single digits.

Yet it was still better than GTA4 in terms of resolution, performance, and assets/shaders.
 

TyrantII

Member
Can you give an example of game franchises where that happened? Games like GTAV actually got a resolution boost as the devs became more familiar with the hardware.


They can't, because for most devs it never happened. Maybe with some shovelware.

Keeps popping up though. Gotta explain those low end APUs away.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
In fact, it will actually make the IQ worse than 720p due to improper scaling.

They both divide evenly by 9.
792p will look cleaner and more detailed after being scaled straight to 1080p.
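For what it's worth, here is a quick, purely illustrative check of the ratios being argued about, assuming 16:9 framebuffers scaled to a 1080p panel (a small Python sketch, not anything from Ubisoft or dark10x):

from fractions import Fraction

# Exact upscale ratio from each common rendering height to a 1080p output.
for height in (720, 792, 900):
    ratio = Fraction(1080, height)
    print(f"{height}p -> 1080p: x{float(ratio):.4f} (exact ratio {ratio})")

# 720p -> 1080p: x1.5000 (exact ratio 3/2)
# 792p -> 1080p: x1.3636 (exact ratio 15/11)
# 900p -> 1080p: x1.2000 (exact ratio 6/5)

None of these are integer scales, so every 1080p output pixel is interpolated either way; the disagreement is only over whether the cleaner 3:2 step from 720p outweighs the extra source detail at 792p.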
 
It's the fact that they themselves said it was full HD that pisses me off; 900p is not full HD. If Ubisoft keeps up with their bullshit I will not buy any of their games, no matter how good they are (I have EA on my blacklist, lol).



I think maybe when the consoles first launched and this game was meant to come out, but now? Nah, I think hype has died down a hell of a lot for this game (I was never hyped for it, but I want the stuff that comes with it lol).

I think GAF collectively is very wrong about this game. It has allowed things that really shouldn't weigh too heavily against it to paint an aura of mediocrity and negativity, one that persists despite the fact that nowhere near enough has been shown to warrant such an overreaction. I've always gotten the impression that this is a game that's hard to show off when you're not in the more exciting missions.

There are parts that look like they could use work, such as the animation when he's holding a cell phone in his hand in some of the cutscenes I've seen, but I think the rest looks perfectly fine. I still see every bit as much potential, if not more, for a game as amazing as the initial E3 trailer implied it could be. The graphics somehow not seeming as awe-inspiring doesn't really do much to dampen my excitement.
 
I played Dragon's Dogma with black bars... the whole game... they were there. Yes, people decried the decision and raged against the machine. And many, including myself, wound up loving the game (and its sequel.... please???). This pixel-counting perversion infects threads on every new game that comes out. Features, gameplay, and mission discussions are now overwhelmed by graphics and resolution screenshot wars. I want to knock on the walls and test the foundation of the house, not stand on the lawn arguing about the color of the paint.
 
In fact, it will actually make the IQ worse than 720p due to improper scaling.

Why the heck would you believe this? Unless you have a native 720p fixed-pixel display (and you almost certainly don't), 792p is always going to look better scaled to 1080p than 720p. The only resolutions that are special are your display's native resolution, or one that is an exact linear scale of it. Anything else: more p's = more detail.

It doesn't work that way. We have already been over this in the Titanfall threads and dark10x tested and confirmed it.
What was his methodology? Are we talking about setting the Xbox One to output at 720p, so that your TV handles the scaling vs having the Xbox One set to 1080p so it handles the scaling? That doesn't make what you think true.
 
Why the heck would you believe this? Unless you have a native 720p fixed-pixel display (and you almost certainly don't), 792p is always going to look better scaled to 1080p than 720p. The only resolutions that are special are your display's native resolution, or one that is an exact linear scale of it. Anything else: more p's = more detail.


What was his methodology? Are we talking about setting the Xbox One to output at 720p, so that your TV handles the scaling vs having the Xbox One set to 1080p so it handles the scaling? That doesn't make what you think true.

Did you seriously just say 72 more pixels? :p

It's actually 193,536 more pixels.
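(For the record, assuming both render at 16:9: 1408 × 792 = 1,115,136 pixels versus 1280 × 720 = 921,600 pixels, a difference of 193,536, or roughly 21% more.)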

Nice try. :) How bad do some people think people's televisions are?


Here you go.

http://www.neogaf.com/forum/showpost.php?p=103680287&postcount=1202

http://www.neogaf.com/forum/showpost.php?p=103689488&postcount=1267
 

Yoday

Member
Didn't someone say they didn't use any of the delay to improve graphics, but to fix gameplay problems? I remember in one of the threads people were thinking along the lines of, "What have they been doing all this time? Surely they improved the graphics."
They mention it in the blog post where they announced the resolution and framerate. Something along the lines of the delay taking place to expand the gameplay, not to improve the visuals or performance. I'm sure they did some work on it, but only to clean up the game on the settings they already had in place. I'm sure it runs smoother and has better AA and such than it would have at launch, but they didn't optimize to the point of being able to increase the resolution. I wouldn't be surprised if they lost a number of team members to other projects after the delay, and members of the tech team could have been among them.

Do any of the games you listed have a locked target framerate? Is TLoU or GTAV a solid 30? Or the God of War games a solid 60?
Why does that matter at all? The point is that those games look better, while outputting a higher resolution than similar games earlier in the generation while performing pretty much the same in terms of framerate. Being locked to 30 or 60 is irrelevant if the games performed on par or better than the previous entries.
 

I've read it, but still don't really buy it. I don't think it applies nearly as accurately across the board as he seems to think. There's way too much that can be going on behind the scenes that we simply aren't aware of, and using an example like that as a one-size-fits-all formula for how things will look on every television isn't something I can take very seriously.

I don't normally say this about Dark's stuff, but he's wrong here. A flat shape is a terrible thing to choose for upscaling comparisons.

Even without using something like that, there are just so many extra factors, such as the possible inclusion of some level of MSAA and who knows what else.
 

elhav

Member
When I saw Ubi's minimum requirements for the PC version, I immediately suspected it couldn't be 60 fps and 1080p on the PS4/Xbone.

I still think the PS4 version will perform better than my PC lol; this game has crazy demanding requirements.
 

Papercuts

fired zero bullets in the orphanage.
It doesn't work that way. We have already been over this in the Titanfall threads and dark10x tested and confirmed it.

It was also tested using other examples.

[two comparison GIFs]
 
[scaling comparison images]


I scaled my avatar down to 66% and back up to the original size, and then down to 73% and back up to the original size, using a very dirty scale. You can clearly see more detail is lost in the image on the left than in the one in the middle.

Max's eye looks too big in the one on the left. The ugly sharp edge on the top of the morphball is lost and it just looks perfectly round in the 720p example. The detail on the side of the morphball is much less clear. The circle on it is obviously a circle at 792p but not at 720p. The texture of the carpet and the JPEG artifacting are clearer in the 792p one. You cannot make out the headset wire at all in the 720p example, only its shadow. It is faintly visible in the 792p example.

Both the middle one and the one on the left look much softer (as you would expect), but the one in the middle maintains more of the detail from the original.

Again, his example, which focused on just a single shape, only brings our attention to the outline. It doesn't show us what happens to fine detail at all. 792p has more fine detail, and the 36% upscale doesn't lose that extra detail that it has over the one upscaled by 50%.
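If anyone wants to reproduce this kind of round-trip test themselves, a rough Pillow sketch along these lines should do it (the filename and the nearest-neighbor filter are my assumptions; all that was said above is "a very dirty scale"):

from PIL import Image

# Scale an image down to ~66% (720/1080) and ~73% (792/1080) of its size and
# back up again, mimicking the "dirty" round-trip test described above.
src = Image.open("avatar.png")  # placeholder filename
w, h = src.size

for height in (720, 792):
    factor = height / 1080
    small = src.resize((round(w * factor), round(h * factor)), Image.NEAREST)
    back = small.resize((w, h), Image.NEAREST)
    back.save(f"roundtrip_{height}p_equivalent.png")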
 

MavFan619

Banned
I can't lie, I'm bummed about 900p since, aside from BF4, it's not a widespread issue (yet) on PS4, mostly because this game didn't seem that demanding graphically. If something like The Witcher 3 winds up being 900p I could see why. I also don't have a gaming PC or the funds to build one anytime soon, so that's not an option.
 

KiraXD

Member
Agreed. I think people should just accept that the game will be 900p at this point.

You think people should just accept this? It's not acceptable at all... if people voted with their wallets, maybe devs would take a little extra time to think: "Hey, should we release this game @ 900p or work it a bit to get 1080p?"

I'd bet if people just said, "No, this is unacceptable," and waited to purchase Watch Dogs instead of running out and settling for it on launch day... the devs would see the sales and try to fix it, possibly with a patch.
 

Demon Ice

Banned
792? Why not just use 720p and call it a day. It's not like 72 more pixels will change anything. :/

I think at least part of the issue is 720p is associated with last gen, so MS wants to avoid that at all costs.

Titanfall ("Frame rate is king") went with 792p instead of 720p and the result was a game that tears frames and drops from 60 FPS on a regular basis.
 
70 pages about frames per second? At what point do you realize you've been discussing nonsense? The only question that should matter is whether the game is fun or not. This is the same crowd that plays and loves Minecraft and other 8- to 16-bit graphics games, and you never once cared about frames per second.
 

Ysiadmihi

Banned
Why does that matter at all? The point is that those games look better, while outputting a higher resolution than similar games earlier in the generation while performing pretty much the same in terms of framerate. Being locked to 30 or 60 is irrelevant if the games performed on par or better than the previous entries.

It shows that even with "coding to the metal" and years of experience with the hardware, you can't hit target framerates and resolution and provide modernized effects all at the same time. And the 360/PS3 were powerful machines for their respective launch years.

There's less to learn and less room to grow this time around.
 
Well, fair enough. If that's what's behind the curtain, it will be exposed as such, but as of right now, I see a most intriguing game with lots of interesting possibilities that I've never had in other games before. I don't know if it's because of expectations of how powerful the new consoles would be that we're overlooking what this game is trying to do, but what if Watch Dogs, instead of being from Ubisoft, were just some epic indie title? I really do wonder if the reactions here would be the same.

One of the most noted PR tactics AAA publishers like to use is to present a core gameplay "feature" and then show it in its best possible light in order to create buzz and inflate their own hype bubble. Which is pretty standard, as you need to rack up enough hype to justify the expenses on the project.

The problem lies in update after update that barely expands on said feature, reducing it to a mere backdrop wrapped in the menagerie of blockbuster tropes that take cues from whatever pop culture is applicable to their game. That is the main reason why so many are complaining about the discrepancy between the reveal and the final promo material (IMO). It is as if Ubisoft is presenting two very different games and hoping the recent PR will alleviate any concerns about its presentation. As you can see, this tactic clearly backfired, as consumers today are far more wary and quicker to cynicism. Never mind the graphics downgrades, which their own PR set them up for.

Is it because it's one of the mega publishers that we're not taking the time to appreciate what may possibly be a game that's doing some genuinely different things from what we've come to expect over the years from action-adventure open-world titles? What if the game were 1080p and 60fps on both machines, but they failed miserably on their vision and it isn't even close to delivering on the "connected world, hack anything" vision they promised? Would we be hailing Ubisoft and the dev team right now for their incredible work? The game may or may not deliver, but I gladly give them the benefit of the doubt until I see otherwise. Judging their work based on native rendering resolutions is a bit too crazy for me. I think I even saw mentions of Ryse being 900p, as if there's no reason for this game not to be much higher, totally ignoring just how unbelievably different the two games are.

I can see where your enthusiasm for this game lies. Until we get the final product, we can't be sure of the delivery and quality. As for the 1080p/60fps issue, your concerns might be valid if it were just a simple design choice. But it isn't. It's a cross-gen issue. Their "vision", no matter how ambitious, must fit the design of the platform the game is based on - in this case, the "last"-gen consoles. This means any so-called feature that could have been is completely nullified, since the current-gen versions (like the PC version) are merely up-ports. So if you want to know where all the potential for this game could be lost, it would not be in the resolution or frame rate this gen...

it would simply be based on how capable the 360/PS3 are. Which means you'll be judging the game on that criterion, and any faults you find can only be attributed to that.
 
70 pages about frames per second? At what point do you realize you've been discussing nonsense? The only question that should matter is whether the game is fun or not. This is the same crowd that plays and loves Minecraft and other 8- to 16-bit graphics games, and you never once cared about frames per second.

So why post in the thread?
 

Papercuts

fired zero bullets in the orphanage.
I think at least part of the issue is 720p is associated with last gen, so MS wants to avoid that at all costs.

Titanfall ("Frame rate is king") went with 792p instead of 720p and the result was a game that tears frames and drops from 60 FPS on a regular basis.

If this was the case then I don't see why there are 5 games running at 720p on the console.
 

Demon Ice

Banned
If this was the case then I don't see why there are 5 games running at 720p on the console.

Did any of those 5 come out as 720p after TitanFall? Genuinely curious because I don't know, I just know that MS wasn't really expecting Resolutiongate to become such a big deal.


And I honestly can't think of any other reason why Respawn would choose to set Titanfall at a resolution where it still clearly fails to meet their constant-60-FPS goal.
 

MikahR

Banned
I don't have anything else to refute what you are saying, but until proven otherwise I think I'll go with what Dark provided.

Do you really honestly think two developers would have gone with 792p as the native resolution for their respective games unless there was a perfectly valid reason for doing so?

These devs aren't dumb. I'm surprised that quite a few people here in the thread seem to think they are, and that they'd just go with an odd resolution, willy-nilly, just so it wouldn't be 720p.
 

Papercuts

fired zero bullets in the orphanage.
Did any of those 5 come out as 720p after TitanFall? Genuinely curious because I don't know, I just know that MS wasn't really expecting Resolutiongate to become such a big deal.


And I honestly can't think of any other reason why Respawn would choose to set Titanfall at a resolution where it still clearly fails to meet their constant-60-FPS goal.

MGS was the week after, at 720p. I don't think there have really been any other releases in general since then.

I do think the Titanfall thing is weird, but they also said they were going to patch in improvements (which is also weird in itself) and so far nothing has happened there. So I dunno what the hell happened.
 

Demon Ice

Banned
MGS was the week after, at 720p. I don't think there have really been any other releases in general since then.

I do think the Titanfall thing is weird, but they also said they were going to patch in improvements (which is also weird in itself) and so far nothing has happened there. So I dunno what the hell happened.

Hm. Well, maybe Titanfall is just a case of the dev being rushed to meet a release date. We'll see what happens if/when the optimization patch drops.
 

Roo

Member
If they have this resolution on consoles that are significantly more powerful... I don't even want to know what will happen to the Wii U version.
 
[scaling comparison images]


I scaled my avatar down to 66% and back up to the original size, and then down to 73% and back up to the original size, using a very dirty scale. You can clearly see more detail is lost in the image on the left than in the one in the middle.
You can't do the test this way. You're scaling in both directions, which accentuates the blurring effects and swamps any potential anamorphisms. What you need are three unscaled, native images that are 720p, 792p, and 1080p, and then scale up the smaller images once to match.

I actually agree that dark10x was overstating the case for scaling inaccuracies with odd ratios. But his test was well-structured, within its limits. His fault isn't using an unrepresentative source; the effects should be the same no matter how stark or variegated the image may be. Rather, his conclusion is too strong because of the algorithm he chose. Nearest-neighbor scaling is the least capable of interpolating odd ratios. (By definition, basically--it's the sharpest scaling method, which is another way of saying it averages values the least.) He's right that it will throw up anomalies, as will bilinear to a lesser extent. But bicubic so aggressively smashes contrast that any anamorphisms disappear into the haze (as seen in your images, where all sorts of detail just evaporates, in both versions).
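A rough sketch of the fairer comparison described above, again with Pillow (the input filenames are placeholders; you would need genuinely native 720p and 792p captures of the same scene):

from PIL import Image

# Upscale native 720p and 792p captures of the same scene straight to 1080p,
# once each, with the resampling filter as a variable so nearest-neighbor,
# bilinear and bicubic results can be compared side by side.
FILTERS = {
    "nearest": Image.NEAREST,
    "bilinear": Image.BILINEAR,
    "bicubic": Image.BICUBIC,
}

for name in ("native_720p.png", "native_792p.png"):  # placeholder inputs
    img = Image.open(name)
    for label, resample in FILTERS.items():
        img.resize((1920, 1080), resample).save(f"{name[:-4]}_to_1080p_{label}.png")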
 
To the people who think Ubi will make the PC version look and run like the initial reveal trailer, I ask you: how much did the PC version of Far Cry 3 look like its reveal trailer?

I don't know how much it looked like the initial reveal trailer, as I don't remember catching it, but it looked amazing regardless. Which, by the way, is how I expect Watch_Dogs to look, too.

I was bouncing between the PC and PS4 versions, but ultimately decided to go with PC. Cheaper price, and my PC is powerful enough to run it at high settings. Guess my PS4 will be collecting dust until the exclusives hit.
 
70 pages about frames per second? At what point do you realize you've been discussing nonsense? The only question that should matter is whether the game is fun or not. This is the same crowd that plays and loves Minecraft and other 8- to 16-bit graphics games, and you never once cared about frames per second.

I'd wager that only about 10 of the pages have been about fps.
Anyway, don't come into a tech thread bitching about tech threads. It makes you look like a whinging fool.
 
M°°nblade said:
And how would you know? Last gen games weren't all native 720p either.

It's perfectly possible for 900p to become the 'sub-HD' of the current generation.
Possible, yes. But thus far it seems 1080p will be the standard resolution for PS4. Killzone is, Infamous is, AC4 is, MGS Ground Zeroes is, Tomb Raider DE is, CoD, NFS, Knack, etc. And some upcoming games are already said to be 1080p (but we'll have to see).
But 900p with great effects and good AA could also be a great alternative imo.
 
All these comments from peeps in the beta have me worried haha. Originally, I was disappointed by the console performance announcement but obviously had no problem settling for the PC version, but now it's sounding like the game might not even be worth launch-day price on PC. The original reveal was fantastic too.
 

kriskrosbbk

Member
All these comments from peeps in the beta have me worried haha. Originally, I was disappointed by the console performance announcement but obviously had no problem settling for the PC version, but now it's sounding like the game might not even be worth launch-day price on PC. The original reveal was fantastic too.

I am trying to follow the thread, but which comments are you referring to?

Edit: This is taken from a Watch Dogs forum:

Two things - it's not fake, but it was a CLOSED beta last night.

My friend was on it, as I was on the Xbox One closed beta the other weekend.

The betas are being managed by VMC Home testing and I have posted a link in another thread (find them on Facebook - the page shows the testing schedule also).

In the email we received prior to testing, it stated the graphics were dumbed down for the purpose of testing - whether this is true remains to be seen.

He confirmed there was little difference between the Xbox and PS versions (he has both consoles so was on both betas - I only have an Xbox), apart from slightly fewer jaggies and lighting effects maybe being a little better on the PS4.

http://watchdogsforum.net/viewtopic.php?f=13&t=2434

It does make sense; most betas are indeed toned down from the final release.
 

Bgamer90

Banned
You think people should just accept this? It's not acceptable at all... if people voted with their wallets, maybe devs would take a little extra time to think: "Hey, should we release this game @ 900p or work it a bit to get 1080p?"

I'd bet if people just said, "No, this is unacceptable," and waited to purchase Watch Dogs instead of running out and settling for it on launch day... the devs would see the sales and try to fix it, possibly with a patch.

By "accept" I simply meant to not expect a 1080p patch at launch for this game. Better to be pleasantly surprised than to be disappointed even more.
 