
Ryse Confirmed 900p, was always 900p...

I know its DoF does, as well as the particles emitted from the drone when it does the "stun" move.

They said that they will try to move DoF to compute shaders, but CryEngine does DoF without compute shaders and actually has bokeh on current-gen consoles.
Particles are doable on the CPU; they are just more expensive, and in their tech presentation they had particle systems locked to the CPU. That could change later, though.

Everything is possible without compute shaders, so on current-gen consoles it's all about fidelity. I think Resogun is the one game that would be severely limited by current-gen tech and not doable without reworking the game's destruction model.
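For context on the CPU-vs-compute point above: a particle update is independent per particle, which is exactly why it maps cleanly from a CPU loop onto a one-thread-per-particle compute shader. A minimal, purely illustrative Python sketch (hypothetical names, not any engine's actual code):

```python
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    vx: float
    vy: float
    life: float  # seconds remaining

GRAVITY = -9.8  # applied to vy each step

def update(particles, dt):
    """One simulation step. Each particle's update is independent of
    the others, which is why the same work moves cleanly to a GPU
    compute shader with one thread per particle."""
    alive = []
    for p in particles:
        p.vy += GRAVITY * dt
        p.x += p.vx * dt
        p.y += p.vy * dt
        p.life -= dt
        if p.life > 0.0:
            alive.append(p)
    return alive
```

The CPU version just pays for the loop serially (or across a few cores); the compute version runs the same math across thousands of GPU threads.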
 
That's not bad, IQ might be just good enough.

Not trying to pour salt on the wound or anything, but isn't Second Son native 1080p? And that one looks like it has way better effects and facial animation.

I think you're underestimating Imaginarium. Both games have great facial animations. Why do you think Infamous has better facial animations? I will do a comparison, but I already know the outcome: "One is a linear game and the other one is open world."
 
So no downgrade after all.

Seriously folks?
Ryse-Gamers.jpg


Every journalist on the floor played the game at 900p at E3 on screens that close and nobody noticed. I doubt people will notice when they play it on their TVs at home.

Not sure if serious. You really are trying hard to believe that tweet or you just really want to believe the game wasn't running at 1080p at one time on whatever hardware it was on.

So the only native 1080p game is FORZA?
I'll allow myself to laugh at this post... hahaha
 
Compute shaders for what in KZ? The particle system? Didn't they say in the post-mortem that the particle system is handled by the CPU? I don't know if I remember correctly.

They said it was on the CPU for the reveal but would probably be moved to GPU compute for the final release.
 
Does it matter when most of the people complaining about it have had no intention of playing it in the first place? Just sayin'.

Also that 1080 - 900 joke was said before on a different thread. Surprised you're not hearing it left and right already.
 
Does it matter when most of the people complaining about it have had no intention of playing it in the first place? Just sayin'.

Also that 1080 - 900 joke was said before on a different thread. Surprised you're not hearing it left and right already.

It's even worse since it also works with 900-720. It will always be "relevant".
 
The 900p one is a bit blurrier but it's really insignificant.
Why is it that 900p vs 1080p on BF3 is day and night, yet those Crysis 3 shots indicate otherwise? Seriously, 900p makes the game look awful. And my PC is hooked up to a Samsung LCD TV where I sit a few feet away from the screen.

If 900p doesn't look any different from 1080p, then sure, devs can stick to 900p for all I care. Otherwise, 1080p or bust.
 
It was running at 1080p in direct-feed video. Someone proved it here on NeoGAF.

The frame buffer is native 1080p... so no kidding it was 1080p direct feed video.

Also for anyone with a native 1080p display.

When everyone thought it was 1080p, they didn't question it at all, even people playing it; now that we know it's always been native 900p, you guys are calling it ugly.

Most people playing on a 40"-50" from 6'+ away from the TV won't notice a difference at all.

Seriously folks?

Do you have any evidence that it was 1080p native?

We have footage of a kid who played the E3 demo on an Xbox One at home, so we know it ran smoothly on Xbox One hardware.

The framebuffer is native 1080p and it's being upscaled so of course it's being shown at 1080p.
 
Why is it that 900p vs 1080p on BF3 is day and night, yet those Crysis 3 shots indicate otherwise? Seriously, 900p makes the game look awful. And my PC is hooked up to a Samsung LCD TV where I sit a few feet away from the screen.

If 900p doesn't look any different from 1080p, then sure, devs can stick to 900p for all I care. Otherwise, 1080p or bust.

It depends on the upscaling technique and the scene. More detail and fewer post-effects in a scene will make the difference bigger; more post-effects and more moving stuff in the scene will make it smaller.
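To illustrate the point about the upscaling technique mattering, here is a toy 1D comparison of nearest-neighbour versus linear upscaling (illustrative only; real console scalers are more sophisticated):

```python
def nearest(src, out_len):
    """Nearest-neighbour scale: duplicates samples, keeps edges hard."""
    n = len(src)
    return [src[i * n // out_len] for i in range(out_len)]

def linear(src, out_len):
    """Linear interpolation: blends neighbours, softening edges."""
    n = len(src)
    out = []
    for i in range(out_len):
        pos = i * (n - 1) / (out_len - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        t = pos - lo
        out.append(src[lo] * (1 - t) + src[hi] * t)
    return out

edge = [0, 0, 0, 1, 1]      # a hard edge in a 5-sample "row"
print(nearest(edge, 6))     # stays binary: [0, 0, 0, 0, 1, 1]
print(linear(edge, 6))      # picks up an in-between value near the edge
```

On a hard edge, nearest-neighbour keeps the values binary while linear interpolation introduces in-between values, which reads as softness; that is why detail-heavy scenes expose upscaling more than smoke-filled, motion-blurred ones.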
 
The frame buffer is native 1080p... so no kidding it was 1080p direct feed video.



When everyone thought it was 1080p, they didn't question it at all, even people playing it; now that we know it's always been native 900p, you guys are calling it ugly. Go figure; sounds like dumb bias to me.

I haven't played Ryse on my 1080p HDTV. But I know I'd prefer it to be 1920x1080 on my 1920x1080 TV so I can have 1:1 pixel mapping and no upscaling. 1600x900 isn't a deal-breaker for me, but I am very disappointed that it will have been 8 years by the time next-gen hits. I've been playing on a 1080p HDTV since 2006. And I still must play sub-1080p games in 2013 and beyond? Disappointing. Big time.
 
Like I said, it's not hyperbole to me. We've gone through the same cases this gen, where many Sony first-party games have looked half a gen ahead of anything else. Some scenes in Beyond: Two Souls just boggle the mind. I haven't seen anything on PC that comes close to that KZ scene either.

What am I reading right now
 
I haven't played Ryse on my 1080p HDTV. But I know I'd prefer it to be 1920x1080 on my 1920x1080 TV so I can have 1:1 pixel mapping and no upscaling. 1600x900 isn't a deal-breaker for me, but I am very disappointed that it will have been 8 years before next-gen hits. I've been playing on a 1080p HDTV since 2006. And I still must play sub-1080p games in 2013 and beyond? Disappointing. Big time.

Exactly so why don't we wait till we have it playing on our HDTVs before shunning it into a grave?

What I meant was "native 1080p".

Just because something is direct feeding video at 1080p doesn't mean it's native 1080p.

What am I reading right now

Killzone looks better than anything on PC? Someone's on some good drugs.
 
Here's a better 1080p/900p upscaled comparison from Eurogamer.

900p

1080p

There is a noticeable difference, however at least to me, it's fairly subtle and not nearly as bad as that Metro 2033 screenshot.

I don't understand why people care so much though, Ryse looks pretty damn good regardless. I'm primarily a PC gamer and I'm still in awe at how good it looks.

How could anyone tell the difference unless they have 2 TVs side by side?
 
I daresay... that their upscaling and AA are pretty good if most people did not notice this.
SMAA shows its strength once again, I guess.
With the horrible compression from YouTube it is hard to see the difference. With super-high-bitrate or raw footage, the differences become much more visible.
 
Exactly so why don't we wait till we have it playing on our HDTVs before shunning it into a grave?



Just because something is direct feeding video at 1080p doesn't mean it's native 1080p.

I haven't shunned Ryse. I'm bashing Xbox One and PS4 for not being more powerful machines in 2013.
 
How could anyone tell the difference unless they have 2 TVs side by side?

Probably not on day one but later on in the gen when more games are 1080p people will notice. I used to not be able to tell differences between 720p and some sub-HD resolutions but now I see it instantly after getting trained to see what 720p looks like on my set.
 
Not sure if serious. You really are trying hard to believe that tweet or you just really want to believe the game wasn't running at 1080p at one time on whatever hardware it was on.

Just because there's 1080p footage of a game doesn't mean it is native 1080p.
If a game is native 1080p and/or 60fps, you can count on that company making those numbers known loud and clear.
In the case of Ryse, Crytek NEVER claimed it would be native 1080p.
 
The frame buffer is native 1080p... so no kidding it was 1080p direct feed video.



When everyone thought it was 1080p, they didn't question it at all, even people playing it; now that we know it's always been native 900p, you guys are calling it ugly.

Most people playing on a 40"-50" from 6'+ away from the TV won't notice a difference at all.



Do you have any evidence that it was 1080p native?

We have footage of a kid who played the E3 demo on an Xbox One at home, so we know it ran smoothly on Xbox One hardware.

The framebuffer is native 1080p and it's being upscaled so of course it's being shown at 1080p.

I'm not arguing about the looks or smoothness. I saw the leaked UI vid and I liked it. We are talking about being lied to and deceived about what hardware it was running on. Plus CBoat mentioned these things about games being downgraded. 1080p was proven a few pages back by a poster, linked in another Ryse thread. [Post below mine]

Just because there's 1080p footage of a game doesn't mean it is native 1080p.
If a game is native 1080p and/or 60fps, you can count on that company making those numbers known loud and clear.
In the case of Ryse, Crytek NEVER claimed it would be native 1080p.
*sigh* ok
 
I haven't shunned Ryse. I'm bashing Xbox One and PS4 for not being more powerful machines in 2013.

Well of course, neither machine is impressive compared to PC components this time around.

Of course, but here http://www.neogaf.com/forum/showpost.php?p=82450885&postcount=8 Liabe Brave did pixel counting and deduced that the video was native 1080p.

Doesn't the framebuffer being at 1080p native hurt this kind of pixel counting?

As far as CBoat is concerned, where is the downgrade of DR3, which has only performed better and better since we've seen it? RYSE also only looks better since the recent optimization, and the only official word on it being 1080p came from Aaron Greenberg, who later changed that statement to 900p.
 
Like I said, it's not hyperbole to me. We've gone through the same cases this gen, where many Sony first-party games have looked half a gen ahead of anything else. Some scenes in Beyond: Two Souls just boggle the mind. I haven't seen anything on PC that comes close to that KZ scene either.

What am I reading right now
 
According to Eurogamer, Microsoft did call it a native 1080p game at E3, though.

Not uncommon, Polyphony did the same for GT5/GT6. The point is it may as well be 1080p as that's already been proven, since no-one can tell the difference unless told.

Anyone would be forgiven for thinking resolution is the most important aspect to a game's graphics judging by some of the response to this news. It goes without saying (but it's worth reiterating) that Crytek are obviously trying to push the very best graphics they can and the resolution drop from 1920 x 1080 to 1600 x 900 is obviously worth the extra effects they can put in the game. The AA for instance already speaks for itself, which is far more desirable than a native 1080p image with jaggies.
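The trade-off described above is easy to quantify with back-of-the-envelope arithmetic:

```python
# Pixels shaded per frame at each resolution
full_hd = 1920 * 1080   # 2,073,600 pixels
ryse    = 1600 * 900    # 1,440,000 pixels

# 900p shades roughly 69% of the pixels of native 1080p,
# freeing ~30% of the fill/shading budget for extra effects.
print(ryse / full_hd)
```

That ~30% saving per frame is the budget the resolution drop buys back for AA and effects work.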
 
I haven't shunned Ryse. I'm bashing Xbox One and PS4 for not being more powerful machines in 2013.


NEVER judge a system on its launch titles.
Like I said earlier in this thread, EVERY SINGLE ONE OF THESE LAUNCH GAMES on both systems would be 1080p/60fps if the developers had more time to work on them.
 
Interesting conversation, if it did take place...

"Yes everything we showed at E3 and Gamescom was on kits. From my point of view the development is coming along well and with the optimizations we have put in, I am pretty confident the final game will look better than what we showed at E3. I even had to open the cupboard to prove it to some guys from Guerrilla who didn’t believe me. At the demo stations our kits were on top of the cupboard directly connected to the screens, so no way we could have done anything else other than run on kits.”

Nick Button-Brown, the general manager of games for Crytek

http://www.cinemablend.com/games/Crytek-Ryse-Look-Better-Xbox-One-Than-It-Did-E3-59478.html

Boom shakalaka
 
Doesn't the framebuffer being at 1080p native hurt this kind of pixel counting?

It doesn't matter; as long as they're upscaling from 900p to 1080p, native 1080p won't mean anything.

Come on man. This is literally the first line in that post:

Since I'm new to all this, I didn't want to just start with me doing pixel counting as if I was some sort of expert.

It could be, but he has done other tests in that thread and they all prove that his method works.
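For anyone wondering how pixel counting can work at all when the framebuffer is 1080p: upscaled content leaves a repeating cadence that counting methods pick up. A toy sketch of the idea in Python (nearest-neighbour mapping assumed for simplicity; actual counting measures stair-steps on high-contrast edges):

```python
# Map each of the 1080 output rows back to its 900p source row.
# Real scalers interpolate rather than duplicate, but the 6:5
# cadence still shows up on high-contrast edges.
native, output = 900, 1080

src_rows = [y * native // output for y in range(output)]

distinct = len(set(src_rows))   # unique source rows in the buffer
repeats = sum(1 for a, b in zip(src_rows, src_rows[1:]) if a == b)

print(distinct)  # 900: a 1080-line buffer carrying only 900p of data
print(repeats)   # 180: one duplicated row in every 6 (the 6:5 cadence)
```

A truly native 1080p buffer has 1080 distinct rows and no such cadence, which is why the method distinguishes the two even though both are delivered at 1080p.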
 
Can SMAA be brute-forced on any game?

I thought it was a type of post-process AA (similar to FXAA but better), correct?

Not uncommon, Polyphony did the same for GT5/GT6. The point is it may as well be 1080p as that's already been proven, since no-one can tell the difference unless told.

Anyone would be forgiven for thinking resolution is the most important aspect to a game's graphics judging by some of the response to this news. It goes without saying (but it's worth reiterating) that Crytek are obviously trying to push the very best graphics they can and the resolution drop from 1920 x 1080 to 1600 x 900 is obviously worth the extra effects they can put in the game. The AA for instance already speaks for itself, which is far more preferable than a native 1080p image with jaggies.

Can I get an Amen?
 
Boom shakalaka

The leaked footage by the kid showed the E3 demo running on Xbox One hardware.

It doesn't matter; as long as they're upscaling from 900p to 1080p, native 1080p won't mean anything.

It could mean a lot, actually. If it has a good scaler, they could mostly hide the difference in upscaling the content. Of course native 1080p is always going to be sharper, but if Crytek decided that going with 900p and a 1080p framebuffer was the way to go, and managed to fool thousands of people into thinking it was native 1080p while they played feet away from a 40" screen, I think we should give them a chance.

We might see a future this gen where 900p with great scaling is an ongoing trend.
 
Boom shakalaka

...weren't you the guy who just said this?

Not sure if serious. You really are trying hard to believe that tweet or you just really want to believe the game wasn't running at 1080p at one time on whatever hardware it was on.

The post you are quoting demonstrates that it was in fact running at 900p. It was running on actual Xbones (dev kits or otherwise; nevertheless, not high-end PCs), and people did not seem to notice. It couldn't have been running at 1080p.
 
Interesting conversation, if it did take place...

"Yes everything we showed at E3 and Gamescom was on kits. From my point of view the development is coming along well and with the optimizations we have put in, I am pretty confident the final game will look better than what we showed at E3. I even had to open the cupboard to prove it to some guys from Guerrilla who didn’t believe me. At the demo stations our kits were on top of the cupboard directly connected to the screens, so no way we could have done anything else other than run on kits.”

Nick Button-Brown, the general manager of games for Crytek

http://www.cinemablend.com/games/Crytek-Ryse-Look-Better-Xbox-One-Than-It-Did-E3-59478.html

Haha, that's brilliant. So many people I speak to don't believe Ryse was running on X1 at E3. That must be a real confidence booster when rival devs don't believe you (and pretty satisfying when you prove them wrong).
 
Yes, it's a post-process AA solution. However, Crysis 3 is the only game which seems to support it.

Go to the high res screenshot thread (2013) here on GAF. A lot of those people have put it on using SweetFX.

Viva Pinata, for example, can be run with SMAA forced, and it works pretty nicely.
 
Nottodisushittuagain.jpg

This is a RYSE thread, dammit. Talk about Ryse. For example: the character models look great even with the poly reduction. I can't tell the difference. Conversely, I am concerned that the non-native res the game will run at will negatively impact my experience.

*Puts fingers in ears* LALALALALA why would we discuss Ryse in a Ryse thread? It's clearly a comparison thread. Dummy. /s

I've seen character models with 40k triangles that blew me away. The reduction in polygons for the MC is a non-issue for me, and I struggle to see a visible difference in quality for his model after the "downgrade."
 
It could mean a lot, actually. If it has a good scaler, they could mostly hide the difference in upscaling the content. Of course native 1080p is always going to be sharper, but if Crytek decided that going with 900p and a 1080p framebuffer was the way to go, and managed to fool thousands of people into thinking it was native 1080p while they played feet away from a 40" screen, I think we should give them a chance.

We might see a future this gen where 900p with great scaling is an ongoing trend.

If they created an upscaler that good, well, kudos to them.
 
Haha that's brilliant. So many people I speak to don't believe Ryse was running on X1 at E3. That must be a real confidence booster when rival devs don't believe you (and pretty satisfying when you prove them otherwise).

Indeed. For a launch title it does look amazing, can't wait to see what Remedy will bring to the table with Quantum Break graphically, not to mention 343i.
 
Go to the high res screenshot thread (2013) here on GAF. A lot of those people have put it on using SweetFX.

Viva Pinata, for example, can be run with SMAA forced, and it works pretty nicely.

Viva Pinata in 1080p... I want you so.

If they created an upscaler that good, well, kudos to them.

Exactly what I want to see also. I definitely have nothing to back it up, but I'm interested in seeing how good scaling becomes on these consoles.

There is something to be said for being able to lower resolution without much visible loss while freeing up a lot of resources for more stuff on screen.

It still does, and they even made it look better. Keep up, sir.

I haven't seen anything in KZ that matches the character models and facial animations in RYSE, and they now look much better in RYSE than before.

To be fair, KZ doesn't really need it, but if you're going to compare the games, don't ignore what RYSE is doing in close range of the character. You have 7-8 enemies in your face surrounding you, all at 85k triangles with 770 facial joints and full facial animations, while even more AI fight around you, ships land behind you, fires blaze everywhere and smoke fills the scene.

KZ is focusing much more on effects and environmental details like they should for the type of FPS they are making.
 