
Sunset Overdrive runs at 30 fps/900p


p3tran

Banned
I plan on getting one when Halo 5 comes out...

I never said I don't recognize resolution differences. I said that processing beats pixel count.

And that the most impressive-looking game on your 1080p TV right now is still Ryse, a game with a render pipeline of 1600x900.

That's what I said! Do you have anything to say about the above?

A dude was asking if his 1920x1080 TV would look like shit playing games rendered at 1600x900, remember?


Oh, and by the way, how on earth would you get mad about a resolution for a game on a console you don't even have?
Oh, I see... it's "expected," you say, that's why you don't get mad...
Took me a while to figure it out...
 

demolitio

Member
I never said I don't recognize resolution differences. I said that processing beats pixel count.

And that the most impressive-looking game on your 1080p TV right now is still Ryse, a game with a render pipeline of 1600x900.

That's what I said! Do you have anything to say about the above?

A dude was asking if his 1920x1080 TV would look like shit playing games rendered at 1600x900, remember?


Oh, and by the way, how on earth would you get mad about a resolution for a game on a console you don't even have?
Oh, I see... it's "expected," you say, that's why you don't get mad...
Took me a while to figure it out...

No, he was asking if there was that big of a difference between 1080p and 900p, which you'd know thanks to the quote button and reading. Much like you've done with my posts, you've taken something else out of context.

And once again, I said I don't care about it, and I plan on getting one for Halo 5. If you don't believe me, fine, but you sure are angry yourself and making up arguments irrelevant to the part I was even talking about. I entered this thread wondering how the framerate held up, saw that question and your non-response, and responded. That's it. If you want to argue about everything else, find someone else, since having a conversation with you is seemingly impossible at the moment.

I like SO, I want SO. I will get SO someday when I can. I like their games. I would prefer 1080p, but I understand. You would know this if you would read instead of going into rage mode over something so stupid.

And I said your point about Ryse is subjective and has nothing to do with that question, but you keep repeating it because apparently I can't remember anything, despite you wanting to debate something I don't care about.

As SgtCobra said, you seem to care. I just shed some light on your non-answer to someone else's question and tried to answer said question. Remember?

Good night, hope you cool off.
 

demolitio

Member
Oh no! After playing NHL 15, I never want to see a 720p game again. It's just a horrible, blurry mess.

That is the cut-off for me as far as the X1 and PS4 go. I don't mind 900p if need be, even though I prefer native resolution, but 720p is way too annoying; it bugged me on the 360 and PS3 and made me eager for new hardware.

In the end, it's the framerate I've been worried about with SO, and what I've been curious about.
 
That is the cut-off for me as far as the X1 and PS4 go. I don't mind 900p if need be, even though I prefer native resolution, but 720p is way too annoying; it bugged me on the 360 and PS3 and made me eager for new hardware.

In the end, it's the framerate I've been worried about with SO, and what I've been curious about.

I also don't support devs that can't reach at least 900p on the Xbox One. Unless it's Halo 5... I'm such a sucker for Halo :'(
 
Oh no! After playing NHL 15, I never want to see a 720p game again. It's just a horrible, blurry mess.

It doesn't really bother me if I play them on my 32" 720p bedroom TV. :p
But I expected this to be really fast-paced, arcade-like gameplay; I guess I should check out some more footage. The vibrant colors might have misled me.
 

artist

Banned
A quick count on the list gives me:

20 at 1080p
20 under 1080p

That's if we remove TR entirely, since it's both 1080 and 900.

So no, it is not the majority or most.

Whereas the PS4 is at 43 games at 1080p to 4 under 1080p, yet it often gets pulled into discussions about current consoles not hitting 1080p?
Did you count indies? Indies only count when res/fps is concerned.
 

AmFreak

Member
All that would have done is raise prices for those people outside the US even more. So what would that prove?

That this statement,
"And nobody is buying a games console for 500 dollars, let alone 600, when their relevance to the core gaming market is in flux and hitting arbitrary numbers is the least impressive factor for casual buyers",
is already proven false and ridiculous.
You don't go from hero to zero because of $100.
 

Gestault

Member
Uh, you don't read the metadata of a video file. That will always give the framerate of the video encoding.

When the framerate drops, you will see duplicates for multiple frames out of that 30. That is pretty obvious to see at multiple points in the video.

LOL. Framerate counter software either counts torn screens or duplicate images. You don't get anything but the encoding framerate from the metadata.

Hey, at least we now know Aaron Greenberg posts on GAF.

I wasn't referring to the encoded framerate; I was talking about expected vs. discrete frame data. You can run a recording overlay and look for frame delays that stand out. And I did, for this video that cgcg claimed ran in the 20-25 range at all times. I tested it on my end, confirmed that they were full of it, and asked them for substantiation. I ran a capture at 33 milliseconds. The frame delay was basically a constant 0.03 seconds, with intermittent carryover from the additional 0.003. It's possible I'm overlooking something, but I have a tendency to trust my eyes, especially in combination with an analysis tool.

So from everything I can confirm, both from the appearance of it and from the tools I'm using here, even this earlier build is running at a stable enough 30. Someone made something up about the video, didn't think substantiating it was important, and a group took it at face value.
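If anyone wants to check a video themselves, the duplicate-frame counting described in the quotes above is easy to script. Here's a rough, untested sketch, assuming Python with OpenCV; the filename and the 0.5 difference threshold are placeholders of mine, not anything official:

```python
# Rough sketch of a duplicate-frame check on a captured video.
# Requires OpenCV (pip install opencv-python); the filename is hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("sunset_overdrive_capture.mp4")  # hypothetical capture file
encoded_fps = cap.get(cv2.CAP_PROP_FPS)  # always reports the encoding rate, e.g. 30

prev_gray = None
total = 0
dupes = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        # A near-zero mean pixel difference means the frame was repeated,
        # i.e. the game fell below the encoded framerate at that moment.
        if np.mean(cv2.absdiff(gray, prev_gray)) < 0.5:  # threshold is a judgment call
            dupes += 1
    prev_gray = gray
    total += 1

cap.release()
print(f"encoded fps: {encoded_fps}, frames read: {total}, duplicates: {dupes}")
```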
 

btags

Member
I wasn't referring to the encoded framerate; I was talking about expected vs. discrete frame data. You can run a recording overlay and look for frame delays that stand out. And I did, for this video that cgcg claimed ran in the 20-25 range at all times. I tested it on my end, confirmed that they were full of it, and asked them for substantiation. I ran a capture at 33 milliseconds. The frame delay was basically a constant 0.03 seconds, with intermittent carryover from the additional 0.003. It's possible I'm overlooking something, but I have a tendency to trust my eyes, especially in combination with an analysis tool.

So from everything I can confirm, both from the appearance of it and from the tools I'm using here, even this earlier build is running at a stable enough 30. Someone made something up about the video, didn't think substantiating it was important, and a group took it at face value.

I am curious: were there any drops in your analysis, and if so, when? I am really looking forward to this game and hoping it maintains a constant framerate.
 

Gestault

Member
I am curious: were there any drops in your analysis, and if so, when? I am really looking forward to this game and hoping it maintains a constant framerate.

Lemme see if I can get some timecodes for you. Be back in a little while. Just to be clear, this is really rudimentary: it's literally me looking through a stream of reported times for each frame, but as far as I understand (and have seen in the past), it's an accurate method.

Edit: So far, it's been very solid. I had a false positive until I realized the respawn animation fading in from black counted as delayed frames (because it was a black screen). I'm basically spot-checking at this point.

Edit 2: Technically, at 2:42 there was something? It wasn't really visible when I rewatched it at speed. What I've looked over is solid enough that until someone comes along with better tools, I'd say anyone claiming *that* particular video has framerate problems is way off base. I'd have hoped that would be self-evident just from watching it.

Edit 3: On a whim, I decided to check the most obviously performance-intensive segments. Nothing crazy, but for example, the baddie-spigot tanker moment at 4:31 technically dipped to between 28 and 30 for part of a second. That's the extent of what I can pull from it. Not locked, but not what I'd call drops.
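Edit 4: For the curious, the "scan" is nothing fancier than a pass over the reported frame times. A rough sketch of the idea; the find_dips helper and the timestamps at the bottom are fabricated purely for illustration:

```python
# Rough sketch: flag frames whose delay stands out against the ~33 ms
# cadence expected from a 30 fps capture.
EXPECTED = 1 / 30  # ~0.033 s per frame at 30 fps

def find_dips(timestamps, tolerance=1.5):
    """Yield (timestamp, instantaneous fps) wherever a frame took
    noticeably longer than expected."""
    for a, b in zip(timestamps, timestamps[1:]):
        delta = b - a
        if delta > EXPECTED * tolerance:
            yield b, 1 / delta

# Fabricated numbers purely for illustration: five steady frames, one slow one.
times = [i * EXPECTED for i in range(5)] + [5 * EXPECTED + 0.07]
for t, fps in find_dips(times):
    print(f"dip at {t:.2f}s: ~{fps:.0f} fps")
```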
 

cakely

Member
You know what? WHO CARES?
Last generation, THE RULE was that the PS3 got the inferior and/or broken ports of practically EVERY SINGLE GAME,
yet a lot of PS3 owners seemed quite happy, no?

I guess ... you care? And judging by your use of all caps, you seem to care quite a bit.

I like your tag, though.
 

shandy706

Member
Someone mentioned Halo.

I really hope they go for 60fps, 900p, tons of effects, and epic, large battlegrounds.

I might even be OK with a LOCKED 30fps campaign with Ryse's level of graphics.
 

longdi

Banned
I don't quite get the appeal of this game, never mind the poor 900p/30fps. All the gameplay videos just look mindless, messy, and repetitive. I get that people like the newness and wacky nature of it, but IG's games have been rather bland and always a 7.5/10 at best: solid, but not blockbusters. IG is also known for weak gunplay and AI, so it sort of makes sense they would do a game like SO.
 

buckeye13

Banned
It's a shame. I own both a PS4 and an XB1; the XB1 is supposed to be the center of your living room, and most people have their best TV there. I have my 55" Sony W900, which is an amazing gaming TV, but I have moved my PS4 into the living room and my XB1 to my den gaming room, since that has a 32" 1080p TV and lower-res games do look better on a smaller screen. I was really hoping that after dumping Kinect, games would at least match resolution, even if effects had to be tamed down. It doesn't mean this game will be bad, but I'm disappointed at 900p.
 

Dash Kappei

Not actually that important
It's a shame. I own both a PS4 and an XB1; the XB1 is supposed to be the center of your living room, and most people have their best TV there. I have my 55" Sony W900, which is an amazing gaming TV, but I have moved my PS4 into the living room and my XB1 to my den gaming room, since that has a 32" 1080p TV and lower-res games do look better on a smaller screen. I was really hoping that after dumping Kinect, games would at least match resolution, even if effects had to be tamed down. It doesn't mean this game will be bad, but I'm disappointed at 900p.

But that is exactly what they decided against; it's a design decision. There's no 1080p PS4 version to speak of, so going off on that tangent doesn't make sense. They decided to push effects as far as 900p would allow; they could've gone with 720p and even more effects, or with 1080p while dialing them back. There was nothing to match here, as this is the only version of the game in development.
 

Basketball

Member
How's it run on PS4?

 

Dash Kappei

Not actually that important
What's even sadder is the fact that positive SO threads barely manage three pages!

Pretty telling, really.

Oh, get off it, dude; don't play that card. I'd hope people are smarter than that. I can assure you that if current GAF's lovechild Bloodborne were announced to be 720p at 30fps, we'd have a thread in the hundreds, whereas a similar thread confirming 1080p@60 would barely pass ten pages, if that. It has always worked like that in the media as far as news reporting and analysis are concerned.
 

shinnn

Member
Is there any thread discussing the main subject of the article in the OP? You know, "SUNSET OVERDRIVE DEV: 'IT'S THE BIGGEST GAME WE'VE EVER MADE'"
 

NickFire

Member
This was apparently known for a while, yet this thread is 13 pages anyway.

Considering it's a few months after the return of GPU resources, during which time MS sent its engineers to Bungie to get Destiny to 1080p, I can see why people want to discuss the subpar resolution.

Still wondering why the IGN interviewer did not ask him to clarify why he referenced only the CPU when he explained why removing Kinect did not let them get to 1080p.
 

btags

Member
Lemmie see if I can get some timecodes for you. Be back in a little while. Just to be clear, this is really rudimentary. It's literally me looking through a stream of reported times for each frame, but as far as I understand (and have seen in the past), it's an accurate method.

Edit: So far, it's been very solid. I had a false positive until I realized the respawn animation fading in from black counted as delayed frames (because it was a black screen). I'm basically spot-checking at this point.

Edit 2: Technically at 2:42 there was something? It wasn't really visible when I rewatched it at speed. What I've looked over is solid enough that until someone comes along with better tools, I'd say anyone saying *that* particular video has framerate problems is way off-base. I'd have hoped that would be self evident just from watching it.

Edit 3: On a whim, I decided to check the most obviously performance intensive segments. Nothing crazy, but for example, the baddie-spigot tanker moment at 4:31 technically dipped to between 28 and 30 for part of a second. That's the extent of what I can pull from it. Not locked, but not what I'd call drops.

Thanks for the info, that seems pretty solid.
 

btags

Member
Considering it's a few months after the return of GPU resources, during which time MS sent its engineers to Bungie to get Destiny to 1080p, I can see why people want to discuss the subpar resolution.

Still wondering why the IGN interviewer did not ask him to clarify why he referenced only the CPU when he explained why removing Kinect did not let them get to 1080p.

A 10% jump in GPU performance would not necessarily allow for a resolution increase from 900p to 1080p. Games that underwent this increase (Destiny and D3) were most likely already near acceptable performance at 1080p, and the small boost allowed them to reach 1080p with more consistent framerates.
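The back-of-envelope math backs this up: 1080p pushes 44% more pixels than 900p, well beyond what a ~10% boost covers.

```python
# Back-of-envelope: the pixel cost of 900p -> 1080p vs. a ~10% GPU boost.
pixels_900p = 1600 * 900      # 1,440,000 pixels
pixels_1080p = 1920 * 1080    # 2,073,600 pixels
print(pixels_1080p / pixels_900p)  # 1.44 -> 44% more pixels to shade
```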
 

NickFire

Member
A 10% jump in GPU performance would not necessarily allow for a resolution increase from 900p to 1080p. Games that underwent this increase (Destiny and D3) were most likely already near acceptable performance at 1080p, and the small boost allowed them to reach 1080p with more consistent framerates.

I tend to agree with your thought process, but you missed my point. My point is: what kind of gaming "journalist" hears a developer discuss the CPU when explaining why the return of Kinect resources doesn't get the game over the 1080p hurdle, and doesn't ask any follow-up about the GPU? Seems to me the developer was given a convenient way to avoid saying that the GPU time slice that was returned really doesn't amount to much.
 