
Ryse Confirmed 900p, was always 900p...

It's not about Ryse looking good or bad. Or about people being able to notice 900p vs 1080p. The reason people are paying close attention to every little detail is simply because it confirms the rumors about the power gap between the consoles. The fact that only Forza is running at 1080p says a lot.

You are way off the mark.

Are you suggesting a poorer (visually) looking game that ran at 1080p would mean the power gap was no longer present?

Does DriveClub running at 30fps suggest a lack of power? Honest question.
 
Can someone tweet asking about the native resolution of the vidoc on Crytek's site? If the game has that IQ then nobody needs to worry. But I doubt it, of course.
 
You are way off the mark.

Are you suggesting a poorer (visually) looking game that ran at 1080p would mean the power gap was no longer present?

I think he's suggesting that if they could get it to look like the new shots at 1080p native it would be telling of the power within the box.
 
Comparing E3 to the latest build, I think the game looks a lot better now. It's seemed to me for a while that XBO was more designed to target 1600x900 instead of full 1080p. Ever since we got the leak about the upscaler and the display planes which can be used to maintain a full HD HUD, negating one of the more obvious issues with scaling, it seemed like that was the approach MS was going for.

For me right now it's a toss up between this and Killzone for best looking launch games. Both seem way ahead of everything else.
 
Not gonna flame you, but I wouldn't call the difference between 1080p and 900p "laughable." It's a difference of 30% in the amount of pixels rendered. 30% is a pretty significant reduction.
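The 30% figure checks out; a quick back-of-the-envelope calculation:

```python
# Pixel counts for 1080p vs 900p (the "30% reduction" claim above).
full_hd = 1920 * 1080   # 2,073,600 pixels
p900 = 1600 * 900       # 1,440,000 pixels

reduction = 1 - p900 / full_hd
print(f"900p renders {reduction:.1%} fewer pixels than 1080p")  # ~30.6%
print(f"1080p renders {full_hd / p900:.0%} of 900p's pixel count")  # 144%
```

So 900p renders roughly 30% fewer pixels, or equivalently 1080p pushes about 44% more.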

Stats are nice, but what does that mean in real world terms for the console gamer that plays normally and sits away from their TV?

It's basically nothing.
 
Not gonna flame you, but I wouldn't call the difference between 1080p and 900p "laughable." It's a difference of 30% in the amount of pixels rendered. 30% is a pretty significant reduction.

Not if you end up with better image quality. Avatar at 720p is going to look better than any game at 1080p.
 
Lol.. I could see no difference in those 2 pictures.

Oh well. The games should be fun regardless.
For what it's worth, I could. That doesn't mean it's a sticking point for whether or not I purchase the game, but the difference exists and the other merits of the game are beside the point.

While you certainly have a point about the game's quality not wholly depending on the resolution, that's not what this thread is about. I have no problem with someone enjoying a game, but things to the effect of "resolution doesn't matter" (not a direct quote of you, obviously. Just a paraphrase of several similarly-themed posts) are off topic in these threads.
 
Stats are nice, but what does that mean in real world terms for the console gamer that plays normally and sits away from their TV?

It's basically nothing.

Truth,

Most (people in general) couldn't even pass a blind 720p vs 1080p "blind test" (I know) let alone a 900p vs 1080p comparison.

I mean the game was apparently 900p the whole time. If they had never told us it was 900p, the airwaves would have been dead quiet.
 
If framebuffer = internal resolution, the tweet doesn't make sense. So he is basically contradicting himself by saying that the game is 1600x900 but then saying that the framebuffer is 1080p. Back to square one.

Makes perfect sense. The final framebuffer can be at a certain res, and have some elements rendered at native res and others at lower res.

Usually particles are rendered at lower resolution, but there's nothing that prevents rendering geometry for example. If there's no intermediate scaler you'd have to upscale the intermediate buffers in software, though.
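The mixed-resolution idea can be sketched roughly: a low-res pass (particles, say, at half resolution) is rendered into a small buffer, then upscaled in software to the final framebuffer size before compositing. The nearest-neighbor filter and the flat row-major buffer layout here are assumptions purely for illustration:

```python
# Sketch: software upscale of a low-res intermediate buffer (e.g. a
# half-res particle pass) to the final framebuffer dimensions.
def upscale_nearest(buf, src_w, src_h, dst_w, dst_h):
    """buf is a flat row-major list of pixels at src_w x src_h."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h          # map destination row to source row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # map destination column to source column
            out.append(buf[sy * src_w + sx])
    return out

# A 2x2 "particle buffer" blown up to 4x4 for compositing at native res.
low = [1, 2,
       3, 4]
high = upscale_nearest(low, 2, 2, 4, 4)
# -> [1, 1, 2, 2,
#     1, 1, 2, 2,
#     3, 3, 4, 4,
#     3, 3, 4, 4]
```

A hardware scaler would do the same mapping (usually with nicer filtering) without spending GPU time on it.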
 
More detailed yes, that comes with the additional pixels. Some people don't care much for it though, but obviously this discussion here is happening because some do. :P

If the game runs at 900p it means that with the same settings it can't run at a full 1080p.

So no, the game at 1080 wouldn't look more detailed, since it would lose some effects/filters.

Ideally, everyone would prefer 1080p over 900p.
 
If the game runs at 900p it means that with the same settings it can't run at a full 1080p.

So no, the game at 1080 wouldn't look more detailed, since it would lose some effects/filters.

Ideally, everyone would prefer 1080p over 900p.

What? We're talking about the theoretical graphical differences if the machine was able to push 1080p with the same number of shaders, and possibly without the polygon cut.
 
Comparing E3 to the latest build, I think the game looks a lot better now. It's seemed to me for a while that XBO was more designed to target 1600x900 instead of full 1080p. Ever since we got the leak about the upscaler and the display planes which can be used to maintain a full HD HUD, negating one of the more obvious issues with scaling, it seemed like that was the approach MS was going for.

For me right now it's a toss up between this and Killzone for best looking launch games. Both seem way ahead of everything else.

As long as you don't count pixels or stare at 8x zoomed screenshots instead of actually playing the game, you should be a happy camper with whatever box you buy.
 
What? We're talking about the theoretical graphical differences if the machine was able to push 1080p with the same number of shaders, and possibly without the polygon cut.

But since it can't, there's no point.

And for the theoretical difference, of course 1080 will look more detailed. What's the point in discussing that?
 
Dark10x played this game on Gamescom, i would love to hear his opinion about IQ.
BlimBlim saw it too.

========

And people should stop trying to upscale shots, because it won't look like that.
Also 1080p for combat vid is not confirmed.
I thought the game looked quite nice but the image quality wasn't great. It seemed like a lower resolution but, at the same time, the TVs were all poorly optimized LCD displays. It was hard to know whether to chalk up visual anomalies to the game or the display. Image quality across all next-gen games was less than optimal, I thought. Killer Instinct and Ryse were definitely the worst offenders, though.

I actually had fun playing Ryse, though, and thought it looked lovely outside of blurry image quality (which is still a big leap over 720p).
 
Avatar is also 24 FPS. So, extending your logic, would you be happy with your game at 24 FPS if the IQ was better?

The difference there is that framerate has a direct effect on gameplay; the visuals are just the visuals and don't affect how the game plays. Having said that, it's not like 30fps is much faster.
 
As long as you don't count pixels or stare at 8x zoomed screenshots instead of actually playing the game, you should be a happy camper with whatever box you buy.

Agreed. I think the differences will be noticeable but it's not like the XBO isn't powerful enough to be considered a generation ahead of the 360. It clearly is.
 
very unfortunate indeed. Game still looks good though!


I think we're already seeing the power difference now between PS4 and X1. KZ: SF looks half a generation ahead and is holding that at 1080p. Knack, DC, and Resogun are all 1080p as well.


This isn't a bad thing at all though for Ryse, especially IF they can really use the compromise in resolution to optimize effects.

Now that's some high grade hyperbole right there!
 
The difference there is that framerate has a direct effect on gameplay; the visuals are just the visuals and don't affect how the game plays. Having said that, it's not like 30fps is much faster.
30 fps divides evenly into a 60 Hz refresh rate. On most displays 24 fps results in judder (unless you have a 72 Hz display with a 3:3 pulldown option).
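The judder can be seen by counting how many refreshes each frame stays on screen (a toy calculation, not tied to any particular display):

```python
# Refreshes-per-frame pattern when showing `fps` content on a `hz` display.
# An uneven pattern means some frames linger longer than others = judder.
def cadence(fps, hz, frames=6):
    starts = [i * hz // fps for i in range(frames + 1)]  # refresh index where each frame begins
    return [b - a for a, b in zip(starts, starts[1:])]

print(cadence(30, 60))  # [2, 2, 2, 2, 2, 2] -- even, no judder
print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3] -- uneven 3:2 pulldown = judder
print(cadence(24, 72))  # [3, 3, 3, 3, 3, 3] -- even again at 72 Hz (3:3)
```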

Probably the best looking launch game. DEAL WITH IT
On xb1? Yep.
 
I usually like reading the hottest threads on gaf for the day, little sad they're all about resolution lately.

Just found this. Hope this softens the atmosphere.
[image: new-years-resolutions-twentysomethings.jpg]
 
I thought the game looked quite nice but the image quality wasn't great. It seemed like a lower resolution but, at the same time, the TVs were all poorly optimized LCD displays. It was hard to know whether to chalk up visual anomalies to the game or the display. Image quality across all next-gen games was less than optimal, I thought. Killer Instinct and Ryse were definitely the worst offenders, though.

I actually had fun playing Ryse, though, and thought it looked lovely outside of blurry image quality (which is still a big leap over 720p).

Yeah, I know that TV setups at such conventions are generally awful, but you can always compare against other known 1080p titles shown on similarly awful TV setups :) And you did, thanks for the info.
 
Strange... I seem to remember that the earlier video was 1080p confirmed? Why not just admit that resolution was lowered to fill the game with awesome-looking effects?

I, for one, care less about native resolution than I do about packing a game with great lighting, textures, and particles. And that's coming from someone who sits seven feet from a 65" tv!

1600x900 is noticeably better resolution than the majority of games this gen, so as long as there's no case where the IQ gets as bad as GTAV (for me, I get headaches from how it looks on my massive TV) I'll be good. I dealt with the jaggies on TLoU and Halo 4 just fine.

In the end, "Full HD" implies 1080p native. His use of the term was wholly inappropriate at best, outright lying at worst.

No, 1080p was never confirmed. And their entire point is that they didn't lower the resolution, you're correct about native resolution not mattering that much but their statement is important because many believe a downgrade in resolution was made.

And all that was said was "Full HD experience", this does not imply native 1080p and actually makes perfect sense for it being 900p upscaled. It was neither inappropriate nor lying. Some were misled because they didn't notice the "experience" qualifier and made assumptions that ended up not being true.

It's really not a big deal at all that it's not 1080p with all the 720p Xbox One games, especially with the visual quality Ryse has.
 
I usually like reading the hottest threads on gaf for the day, little sad they're all about resolution lately.

Pretty sure this counts as thread whining.

Telling people to sit farther from their TV is great, but then you run into situations like mine:
I bought a massive TV when I lived in my old apartment with a roommate. Then I moved into a smaller one by myself. I have arrived at a situation where the absolute farthest I can sit from my TV is 7 feet. So I can definitely notice upscaled resolutions on my games. It's to the point where GTAV looks abysmally bad from an IQ standpoint, and I can't play it for more than an hour or two before getting a headache. To that end, I've switched the game over to my 32" computer monitor, which greatly alleviated the problem.

I acknowledge that the IQ should never get as bad this upcoming gen as with the current gen, as 900p is a far cry from sub-720p. I'm just saying that just because it's not a big deal for some, it is certainly a bigger deal for others.
 
Pretty sure this counts as thread whining.

Telling people to sit farther from their TV is great, but then you run into situations like mine:
I bought a massive TV when I lived in my old apartment with a roommate. Then I moved into a smaller one by myself. I have arrived at a situation where the absolute farthest I can sit from my TV is 7 feet. So I can definitely notice upscaled resolutions on my games. It's to the point where GTAV looks abysmally bad from an IQ standpoint, and I can't play it for more than an hour or two before getting a headache. To that end, I've switched the game over to my 32" computer monitor, which greatly alleviated the problem.

I acknowledge that the IQ should never get as bad this upcoming gen as with the current gen, as 900p is a far cry from sub-720p. I'm just saying that just because it's not a big deal for some, it is certainly a bigger deal for others.

Do you get a headache watching TV broadcasts in 720p at that distance? Hell, 480p?

Whatever you think your issue is, it's not that.
 
No, 1080p was never confirmed. And their entire point is that they didn't lower the resolution, you're correct about native resolution not mattering that much but their statement is important because many believe a downgrade in resolution was made.

And all that was said was "Full HD experience", this does not imply native 1080p and actually makes perfect sense for it being 900p upscaled. It was neither inappropriate nor lying. Some were misled because they didn't notice the "experience" qualifier and made assumptions that ended up not being true.

It's really not a big deal at all that it's not 1080p with all the 720p Xbox One games, especially with the visual quality Ryse has.

And I absolutely agree that Ryse blows all the other Xbox One launch games out of the water. It does this from a resolution standpoint (except for Forza) AND an effects and general-next-gen-shininess standpoint (especially DR3 and Forza). The only thing I'm not sold on is the gameplay.
 
I would love for you to take a double blind test using 900 vs 1080 while playing a game you've never played before in full motion on a 47 inch TV sitting 8 feet away. Just normal gameplay conditions. You know, not staring at a wall or carpets. I'd be willing to lay major odds you wouldn't do so well.

Some people can notice a lot of things. I've never tried 120hz gaming so I have no idea how I would do here, but I linked it to show why some swear by it.

There is no such thing as "normal gameplay conditions" though. My "normal gameplay condition" is around a meter or so away from a 23inch 1080p screen. I personally find that games get a lot more immersive when the screen covers a large part of my cone of vision and being slightly near sighted I prefer to sit close enough not to need glasses. While editing the photos I only zoomed to 100% to see what was what and starting Skyrim in 900p made the game look much softer.

One important point to mention is that I took small shots to show how the texture suffered, but those shots don't show how the overall image looked. For me, going from 900p to 1080p is kinda like putting my glasses on, but the effect is very muted in small 160p/360p images, so I suggest watching them full screen on a 1080p monitor.
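As a rough sanity check on the 47-inch / 8-foot scenario mentioned above (assuming the common 1-arcminute rule of thumb for visual acuity, which is itself a simplification):

```python
import math

# Distance beyond which adjacent pixels subtend less than ~1 arcminute
# and can no longer be individually resolved.
def max_resolvable_distance_in(diagonal_in, horizontal_px, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch = width_in / horizontal_px          # one pixel, in inches
    one_arcmin = math.radians(1 / 60)               # ~0.00029 rad
    return pixel_pitch / math.tan(one_arcmin)

d = max_resolvable_distance_in(47, 1920)
print(f"{d / 12:.1f} ft")   # ~6 ft: past this, full 1080p detail is lost
```

By this estimate, at 8 feet from a 47" set the eye can't fully resolve 1080p anyway, which is why viewing distance keeps coming up in these threads; at 1 meter from a 23" monitor the situation is very different.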
 
Do you get a headache watching TV broadcasts in 720p at that distance? Hell, 480p?

Whatever you think your issue is, it's not that.

Actually, I do notice the diminished IQ of 720p broadcasts on my large tv. The image is especially fuzzy at resolutions below that. Just like I can tell when my Netflix SuperHD stream drops to 720p. It does make a difference, whether or not you want to acknowledge it.

And it's cool how you can diagnose my issues. I've had a pain in my left foot on and off for a couple of days, could you tell me what caused that, too? Thanks.
 