
Ryse Confirmed 900p, was always 900p...

This is a gaming forum, where people like to talk about all aspects of gaming before, during, and after games are released, announced, or shown with decent footage. It's normal; otherwise there's less to talk about.

Of course, but when people are attempting to 'analyze' differences between builds and ignore the fact that one screenshot is direct feed and the other extremely compressed stream footage, it gets a bit silly. Especially when talking about things like depth of field. There's discussion, and then there's people pretending to know more than they do. I certainly welcome all technical discussion from people in the know of such things, I get to learn things!
 
It was proven to be native 1080p.

By pixel counting? Doesn't his tweet say they use their own upscaler and send it to the frame buffer at 1080p?

[image: upscaleszsz1.png]


These were drawn at 1600x900 and upscaled to 1920x1080, so how is pixel counting going to help determine that it isn't 1080p originally? I'm genuinely curious, since it seems they are using the upscale as a form of AA, which is exactly what happens when I upscale my original.
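For what it's worth, here's a toy sketch of that upscale (Pillow, and the bilinear filter is purely my assumption; nobody outside Crytek knows what their custom upscaler actually does). A blending filter averages neighbouring source pixels, which softens edges a bit like a light post-process AA pass, which is the effect I'm describing:

```python
# Toy sketch of a 1600x900 -> 1920x1080 upscale. The bilinear filter is an
# assumption; Crytek's custom upscaler is presumably doing something smarter.
from PIL import Image

src = Image.open("ryse_frame.png")   # hypothetical 1600x900 capture
assert src.size == (1600, 900)

# Nearest-neighbour keeps hard pixel stair-steps; bilinear blends the four
# nearest source pixels, softening edges much like light AA would.
hard = src.resize((1920, 1080), Image.NEAREST)
soft = src.resize((1920, 1080), Image.BILINEAR)

hard.save("upscaled_nearest.png")
soft.save("upscaled_bilinear.png")
```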
 
DriveClub is currently running at 30fps/1080p to allow for more detail. This is no different. Unless you think a higher resolution is better than twice the frame-rate.
Has Sony been dishonest about DriveClub's framerate? The discussion is mainly about the fact that most people here believe Ryse was at one point 1080p because Crytek actually showed 1080p stuff.
 
Of course I read the title, but Incompetent Microsoft are the publisher, so it'd be unreasonable to assume they aren't lying.

Microsoft (Greenberg) are the ones that clarified the resolution first. But don't let that get in the way of your console war.
 
By pixel counting? Doesn't his tweet say they use their own upscaler and send it to the frame buffer at 1080p?

[image: upscaleszsz1.png]


These were drawn at 1600x900 and upscaled to 1920x1080, so how is pixel counting going to help determine that it isn't 1080p originally? I'm genuinely curious, since it seems they are using the upscale as a form of AA, which is exactly what happens when I upscale my original.

Depending on the AA method, you could see it through the shading stepping on edges (sub-pixel detail, etc.).
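A toy numpy illustration of that stepping, using a hard 45-degree edge with no AA at all (nearest-neighbour scaling is assumed for simplicity; a real upscaler blends, but the same cadence survives as shading gradients):

```python
import numpy as np

# A hard 45-degree edge (no AA) drawn at 900 rows, then nearest-neighbour
# upscaled by 1.2x to 1080 rows, the same ratio as 1600x900 -> 1920x1080.
n_src, n_out = 900, 1080
edge_x = np.arange(n_src)                          # edge column per source row
out_x = edge_x[np.arange(n_out) * n_src // n_out]  # 1.2x vertical upscale

steps = np.flatnonzero(np.diff(out_x)) + 1         # rows where the edge advances
runs = np.diff(np.concatenate(([0], steps, [n_out])))
print(sorted(set(runs.tolist())))  # -> [1, 2]: mixed run lengths
print(len(runs))                   # -> 900 distinct steps across 1080 rows
```

A native 1080p edge at the same angle would give a uniform run length of 1; the mixed 1s and 2s, and the 900 distinct steps, are exactly what a pixel counter looks for.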
 
The thing is... not everything runs at native res in a renderer (even in PS4 1080p games).

The only game I can think of that ever did that was the first Metro (the game scales basically linearly with hardware as a result).

A lot of PS4 and PC games have their buffers rendering at 1/2 or 1/4 res. For example, GG has the volumetrics and depth of field rendering at quarter res for performance reasons.

That's true, but usually when we talk about the 'native res' of the game we're talking about the res of the main color/lighting buffers. Or in a deferred renderer, the normal/material/lighting buffers or whatever. Yes, some particle and camera effects use quarter or half res buffers in some games, but I think most people define the 'native res' by the other main buffers.
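To make the reduced-res-buffer idea concrete, here's a rough sketch; the blur is just a stand-in for DoF/volumetrics, not anyone's actual pipeline:

```python
import numpy as np

def expensive_effect(buf):
    # Stand-in for a costly per-pixel effect (repeated 3x3 box blur); cost
    # scales with pixel count, which is the whole point of reduced-res buffers.
    out = buf.copy()
    for _ in range(8):
        p = np.pad(out, 1, mode="edge")
        out = (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
               p[1:-1, :-2] + p[1:-1, 1:-1] + p[1:-1, 2:] +
               p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:]) / 9.0
    return out

scene = np.random.rand(1080, 1920)   # stand-in frame (luma only)

# Full res: the effect touches all ~2.07M pixels.
full = expensive_effect(scene)

# Quarter res: halve each axis (1/4 the pixels), run the effect, then
# upsample with a nearest-neighbour repeat for the composite. Soft effects
# like DoF and volumetrics hide the resolution loss well.
low = expensive_effect(scene[::2, ::2])
upsampled = low.repeat(2, axis=0).repeat(2, axis=1)
print(full.shape, upsampled.shape)   # both (1080, 1920)
```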
 
BF3 runs at 720p on 360 and looks good on my 1080p 50" Panny plasma. Unplayable after playing it at 60fps with 64 players on my computer, but that's a different story.

Well, it looks like crap on my 55" Samsung LED TV when I play it on my PS3.

So bad, in fact, that I play it on my 24" 1080p monitor on my PC instead, at 60fps too.
 

You know what's hilarious about this part?

(Note that the screenshot says "This does not represent final game quality" right on it.)

Inherently, when you see that note, you immediately think it means that what you're seeing looks worse than how the final product will look -- i.e. that the message is a "we still have to polish some stuff" disclaimer -- but in this case, it's the complete opposite.
 
Every journalist on the floor played the game at 900p at E3 on screens that close and nobody noticed. I doubt people will notice when they play it on their TVs at home.

Do journalists even know what 1080p is? Like, do we have confirmation on that? Because they all said GTA 5 had no framerate drops and no control issues at all.
 
Of course, but when people are attempting to 'analyze' differences between builds and ignore the fact that one screenshot is direct feed and the other extremely compressed stream footage, it gets a bit silly. Especially when talking about things like depth of field. There's discussion, and then there's people pretending to know more than they do. I certainly welcome all technical discussion from people in the know of such things, I get to learn things!

All I said was one looked blurrier than the other. You didn't see that technical jargon coming out of my posts. Now I know one is from a stream, there's 0 drama or problems here.
 
Has Sony been dishonest about DriveClub's framerate? The discussion is mainly about the fact that most people here believe Ryse was at one point 1080p because Crytek actually showed 1080p stuff.

The discussion is whether 900p is noticeably different from 1080p, clearly it's not.
 
All I said was one looked blurrier than the other. You didn't see that technical jargon coming out of my posts. Now I know one is from a stream, there's 0 drama or problems here.

I wasn't talking about you, meant to edit my post to clarify that. Was speaking in general, sorry for the misunderstanding. Really didn't mean to insult you.
 
I'm confused. Who actually confirmed that the games shown at E3 were running on XB1 retail units? Digital Foundry?

It seems to me that what was shown at E3, most pointedly the graphically impressive Forza 5 and Ryse demos, was running on XB1 devkits.

And it seems plausible that what has happened since is that devs have had to port code properly over to XB1 hardware/software and are in the optimisation process, and perhaps realising that, especially with the console's apparently poor graphics drivers atm, they can't hit the same level of perf as code running on devkits.

This is why we are seeing Ryse scaled back from the eye-candy at E3, and why F5's latest demo doesn't look anywhere near as good as the stage shown at E3 that got race fans excited.
 
The discussion is whether 900p is noticeably different from 1080p, clearly it's not.
Oh, CLEARLY. Guess we should all go buy 720p TVs... they're cheaper! /s

The ability to see that a game is running at a non-native res on your display is entirely based on your viewing conditions. If I am looking for it, I will see it. Will it impact me during gameplay? Maybe, maybe not. If it's well upscaled, it is much less noticeable. But to say it isn't noticeable at all is a joke. Play games on a 65" display and tell me you don't notice that kind of thing.
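Some rough numbers on that; the ~60 pixels-per-degree acuity rule of thumb and the distances below are my assumptions, just for illustration:

```python
import math

# Pixels per degree of visual angle for a 16:9 panel: below ~60 px/deg,
# normal (20/20) vision can resolve individual pixels and scaling artifacts.
def px_per_degree(diag_in, width_px, distance_in):
    panel_width_in = diag_in * 16 / math.hypot(16, 9)
    px_pitch_in = panel_width_in / width_px
    deg_per_px = math.degrees(2 * math.atan(px_pitch_in / (2 * distance_in)))
    return 1 / deg_per_px

for dist in (60, 96, 144):           # 5 ft, 8 ft, 12 ft
    print(dist, round(px_per_degree(65, 1920, dist), 1))
# -> roughly 36, 57, 85 px/deg: at 5 ft a 65" 1080p panel is well under the
#    acuity limit, so artifacts are visible; by 12 ft they largely are not.
```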
 
It was running at 1080p in the direct feed video. Someone proved it here on NeoGAF.

Upscaled, native 900p. Ryse has been running on the hardware since E3, and the hardware hasn't gotten any less powerful.

Some people assumed it was 1080p because it looks good, but 900p does look good.
 
The discussion is whether 900p is noticeably different from 1080p, clearly it's not.

It is noticeably different. What you mean to say is that 900p doesn't make the game look like crap. If you are given the option in the graphics menu and there's no performance drop, you will choose 1080p 100% of the time, because it looks better, though it's not a world of difference. And sure enough, if you don't have the option to see the differences, you won't notice them...

What's worrying, though, or not, is that out of three first-party MS games only one is 1080p.
 
Depending on the AA method, you could see it through the shading stepping on edges (sub-pixel detail, etc.).

Yeah, but the guy counted pixels in this image:

[image: Kh3IYlE.png]


Here is my upscale at 400%:

[image: upscale2iju5q.png]


I'm not seeing how his count is conclusive since the output resolution IS 1080p.
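The reason a count can still be conclusive even though the output IS 1080p: 1080/900 = 6/5, so a 900p frame scaled up can only carry 900 distinct rows of information. A toy check (nearest-neighbour is assumed; with a blending upscaler you'd look for the same 6-pixel cadence in the gradients instead):

```python
import numpy as np

# 900 source rows, each distinct from its neighbours, upscaled to 1080.
h_src, h_out = 900, 1080
src = np.zeros((h_src, 64), dtype=np.uint8)
for y in range(h_src):
    src[y, : y % 64] = 255            # every adjacent pair of rows differs

out = src[np.arange(h_out) * h_src // h_out]   # nearest-neighbour 1.2x row map

# 1080 output rows, but every 6th duplicates its neighbour (6/5 ratio),
# so only 900 distinct rows survive; that's what the count lands on.
dupes = int(np.sum(np.all(out[1:] == out[:-1], axis=1)))
print(h_out - dupes)                  # -> 900
```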
 
I'm worried that if they start compromising the resolution now, they will continue to do so later in the console's lifetime just to push graphics. Just do what you can at 1080p.
 
I'm confused. Who actually confirmed that the games shown at E3 were running on XB1 retail units? Digital Foundry?

It seems to me that what was shown at E3, most pointedly the graphically impressive Forza 5 and Ryse demos, was running on XB1 devkits.

And it seems plausible that what has happened since is that devs have had to port code properly over to XB1 hardware/software and are in the optimisation process, and perhaps realising that, especially with the console's apparently poor graphics drivers atm, they can't hit the same level of perf as code running on devkits.

This is why we are seeing Ryse scaled back from the eye-candy at E3, and why F5's latest demo doesn't look anywhere near as good as the stage shown at E3 that got race fans excited.

I thought the same thing, but Albert Penello clarified that the devkits have the exact same performance as retail units, so that can't be the case. I'd look for the quote, but I'm on my iPad, so I guess you're gonna have to take my word for it.
 
It's misleading to say "Full HD" instead of the actual resolution. It was a dead giveaway, though, since if it were running at 1080p he would have proudly said so.

That was my impression also. When he used vague language it was obvious to me it was not full 1080p, because he would have stated that, as you said so well, proudly. But because of his vague language and the use of "Full HD", many translated that as 1080p. It's good he retweeted a response to clarify.
I did not know there were varying degrees of HD. I honestly thought anything above 720p was the "Full HD" experience. You learn something new every day.
 
I'm confused. Who actually confirmed that the games shown at E3 were running on XB1 retail units? Digital Foundry?

It seems to me that what was shown at E3, most pointedly the graphically impressive Forza 5 and Ryse demos, was running on XB1 devkits.

And it seems plausible that what has happened since is that devs have had to port code properly over to XB1 hardware/software and are in the optimisation process, and perhaps realising that, especially with the console's apparently poor graphics drivers atm, they can't hit the same level of perf as code running on devkits.

This is why we are seeing Ryse scaled back from the eye-candy at E3, and why F5's latest demo doesn't look anywhere near as good as the stage shown at E3 that got race fans excited.
They were on devkits, like every game ever presented before launch. All PS4 games are also presented on devkits, and no, devkits do not have better hardware.
 
I'm still unsure as to why it matters. Whether Ryse is a fun game remains to be seen, but it is very nice looking graphically; even if it were stuck at 720p it would still be showcasing graphics well beyond even the best-looking 360/PS3 games currently available. I'm talking from a purely technical standpoint here too; appreciation of the selected art style is entirely subjective, so it's not even worth mentioning.

I found it equally ridiculous that people were complaining about an apparent poly count downgrade last week because the main character had gone from 120,000 down to around 80,000 polys, the evidence apparently being a single slide at a trade show. I doubt even one person (outside of Crytek) would be able to pinpoint where those polys got removed, if they were actually gone in the first place, and for the record even 80,000 polys is three times what most game characters used on the current crop of consoles (outside of fighting games), so it's still a technical achievement.

As I mentioned in another similar thread, these are launch games. 720p to 1080p is almost as big a resolution jump as SD to HD was last generation. Then you obviously want to quadruple texture detail, have more things on screen, and use DirectX 11 shaders (and lighting and shadowing in DX11 are VERY demanding, especially SSAO), and to top it all off, instead of 100,000 polygons per frame, devs are probably aiming at 500,000 to 1,000,000 polys per frame.
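Quick pixel math behind that jump, taking 480p as the SD baseline and 720p as the typical last-gen render target (both assumptions):

```python
sd     = 640 * 480      #   307,200 px
hd720  = 1280 * 720     #   921,600 px -> 3.0x the SD count
hd1080 = 1920 * 1080    # 2,073,600 px -> 2.25x the 720p count
print(hd720 / sd, hd1080 / hd720)
```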

Last generation went on for too long; many decided to buy a gaming PC in the last 3 years and got used to gaming at 1080p/60fps, so now anything less is a disappointment... until you realize these boxes cost less than a third of what that nice gaming rig did.
 
I have a "I can take it or leave it" attitude towards Ryse but what worries me is what it says about CryTek and the future Crysis 4 which I had hoped would be a return to Crysis 1 now that the console concessions shouldn't be so great with next-gen RAM amounts.
 