
Xbox One | Ryse confirmed running native at 900p

Status
Not open for further replies.

nib95

Banned
http://www.eurogamer.net/articles/digitalfoundry-ryse-runs-at-900p

I do take issue with the comparison where they say that 1080p vs 900p won't be a huge difference.

I had to lower the res of some games from native 1080 to 900 on my old laptop and it was certainly enough to notice. Not sure how much upscaling improvements can mask that but native was always better.

With Crysis 3 and more recently with Bioshock Infinite, I preferred to drop settings here and there rather than lower the resolution to 900p. A few visual settings here and there you barely notice, but 900p makes the entire image slightly blurrier. I should add though, it's MUCH less noticeable on an HD TV at normal viewing distances than up close on a monitor, so despite my opinion that Leadbetter is biased and all too often prone to defending or polishing Microsoft's image, I think he's not too far off the money here.

Though I imagine if the roles were reversed and the PS4 was on the receiving end, he'd have heavily lambasted it instead of downplaying the difference. His most recent quotes, about how the 10% CPU bump over the PS4 was "significant" whilst a ~40% GPU advantage for the PS4 still left the two "very evenly matched", got a laugh out of me.

Gotta spin that news. It's embarrassing, really. Well, no Ryse for me then. Probably no Dead Rising 3 either if the dynamic resolution is true.

Do you usually base your game purchases off the resolution they're running at? :S
 
I refuse to shell out 60 bucks for what looked like a 6 hour QTE-fest with limited (re)playability, so I'm unmoved by the news. Forza 5 better fucking deliver the 1920x1080 goods tho. No upscaling.
 

jhendrix

Banned
Man, I'm so glad I'm getting both consoles so this insane pixel-counting and ego-boasting contest doesn't matter to me. Also, why did this turn into a Driveclub vs Forza thread? The amount of stress relief is worth the price of getting both consoles.

I'm still really looking forward to Ryse, and from everything I've seen it looks amazing. I'm just glad this gen we seem to be rid of sub-HD resolutions. Kudos to them for being honest, because if they hadn't said anything it would have been a bloodbath after launch.
 

Vintage

Member
It's exciting how unexciting this "next" generation is. I was expecting some crazy performance or mind-blowing innovative features, but nothing comes close. The PS4 is a bit better on the performance front, the XB1 is better on features, but both of them are still 'meh'.
I don't know why they imagine this will last for another 10 years.
 

Jack cw

Member
What's price got to do with anything?

If next gen were just keeping the same graphical fidelity as the 360/PS3, then these consoles would be strong enough for 1080p 60fps. See the cross-gen ports.

Now, if you want to push the graphics beyond what the 360/PS3 did and have a constant 1080p 60fps in the majority of games, then yes, 1.94TF with what is essentially a laptop CPU is weak.

The price determines what you get. And what you get is not so bad, at least with the PS4. And seriously man, every next gen game so far is a huge leap over current gen. You are free to spend your $1,500 on your 3.5TF system :)
 

gaming_noob

Member
Yeah, I've noticed Xbox fanboys hiding behind PC and trying to put the PS4/X1 in the same ballpark of weakness... a sad act, really.

The scenario goes like this:

News: Ryse is sub-1080p
X1 fanboys: these consoles are weak, get a PC!

Sad that you're grouping an entire fanbase like that.
 

IN&OUT

Banned
Well, my problem here is less that Knack isn't running at 60FPS, though it does seem weird considering the graphical impression the game gives so far. It looks like it should run at 120FPS.

No, my problem is that the chief architect of the PS4 seemingly doesn't know that an unlocked 40FPS framerate is a stuttery mess, while a locked 30FPS with motion blur may be acceptable to some as far as motion judder goes. That doesn't bode well for next-gen framerates.

Well, this guy who you claim doesn't know shit designed the PS4, which is far more impressive than anything Redmond's "Tech Fellows" would ever come up with. So yeah, he knows what he's talking about, and you are in no position to question what he said.
 

Amir0x

Banned
what the... this is... we still have to worry about sub-1080p resolutions on next-gen consoles!?

*sigh* priorities devs, please :(

+ Solid stable 30fps or more
+ 1080p native

Two criteria that should be met before any other goals :(
 

artist

Banned
Yes, CryEngine 3 uses a g-buffer with a depth of 64 bits (Depth24Stencil8 and A8B8G8R8) per pixel [1]. That would be ~16MB at 1080p and ~11MB at 900p. The other buffers account for a total of ~30MB at 1080p on the PC, resulting in a total of ~46MB at 1080p. It might well be that the decision to go with 900p is down to ESRAM size limitations.

[1] http://de.slideshare.net/TiagoAlexSousa/secrets-of-cryengine-3-graphics-technology (slide 5)
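For anyone who wants to check the arithmetic, a quick back-of-the-envelope script (the only input is the format quoted above, Depth24Stencil8 plus A8B8G8R8, i.e. 8 bytes per pixel):

```python
# Back-of-the-envelope G-buffer sizes for the formats quoted above:
# Depth24Stencil8 (4 bytes) + A8B8G8R8 (4 bytes) = 8 bytes per pixel.
BYTES_PER_PIXEL = 8

def gbuffer_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """G-buffer size in MiB for a given render resolution."""
    return width * height * bytes_per_pixel / (1024 * 1024)

print(f"1080p: {gbuffer_mb(1920, 1080):.1f} MB")  # ~15.8 MB
print(f" 900p: {gbuffer_mb(1600,  900):.1f} MB")  # ~11.0 MB
```

So dropping to 900p saves roughly 5MB on the g-buffer alone, which matters when the whole ESRAM pool is 32MB.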
Time to revive this thread?

http://www.neogaf.com/forum/showthread.php?t=674333
 

jhendrix

Banned
Sad that you're grouping an entire fanbase like that.

It's worth the price of both consoles just to be able to not give a damn about all this arguing crap. Being able to sit back and spectate this petty console war while looking forward to any next gen game is priceless.
 

netBuff

Member
Pretty crazy that Ryse doesn't even manage 1080p30 - it really doesn't look all that good, the flaws are extremely apparent in the recently posted trailer (flat, low resolution textures mainly).
 
Oh this sounds interesting from the Eurogamer Ryse article.

Developer sources have also suggested that the 32MB of ESRAM - the fast scratchpad memory for high-speed graphics processing - may favour lower resolution render targets. This is a topic we hope to return to soon with some hard data from on-the-record sources.

I would not be surprised if some developers are frustrated at the specs of the Xbox One and are ready to fuel GAF.
 

KKRT00

Member
Nothing's wrong with it per se, I just find it a little ironic that a guy who nitpicks over every pixel in face-offs suddenly doesn't feel like the difference between 900p and 1080p will be that large due to some upscaling.

Where did he say that it isn't large? He just said that it's not as visible as with sub-720p resolutions, and that's a fact. He even proved it with shots.

====

The programmer isn't the person who deals with 8MB chunks; that's up to the hardware guys. What happens is you read and write to addresses that are in the ESRAM's range, which is probably a 32MB range of memory.

The reason there are four 8MB chunks is to increase performance; the addresses are probably interleaved between all four as well.

Maybe it's automated and just distributed by bits. You could be right. Still, I thought developers have control over it, so they can maximize utilization of the ESRAM, for example by storing one buffer in one of the chunks for a few frames for some feature.

Btw, it's irrelevant anyway, because CryEngine 3.5 has a different G-Buffer than Crysis 2.
It's 3x 32-bit buffers, so ~24MB at 1080p.
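As a purely illustrative sketch of the interleaving idea (the 64-byte granularity and round-robin scheme below are my assumptions, not documented Xbox One behavior):

```python
# Illustrative sketch of interleaving a flat address space across four
# 8MB ESRAM chunks (hypothetical scheme; not from any hardware docs).
CHUNKS = 4
LINE = 64  # assumed interleave granularity in bytes

def chunk_for_address(addr):
    """Which of the four chunks a given byte address maps to."""
    return (addr // LINE) % CHUNKS

# Consecutive 64-byte lines round-robin across the chunks, so a linear
# read streams from all four chunks in parallel:
print([chunk_for_address(a) for a in range(0, 512, 64)])
# -> [0, 1, 2, 3, 0, 1, 2, 3]
```

If it works anything like this, the programmer never targets a specific chunk; the hardware spreads a linear buffer across all four automatically.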

 

IN&OUT

Banned
It's worth the price of both consoles just to be able to not give a damn about all this arguing crap. Being able to sit back and spectate this petty console war while looking forward to any next gen game is priceless.

Savor those moments, they only peak every 6 years lol
 

nib95

Banned
Oh this sounds interesting from the Eurogamer Ryse article.

Developer sources have also suggested that the 32MB of ESRAM - the fast scratchpad memory for high-speed graphics processing - may favour lower resolution render targets. This is a topic we hope to return to soon with some hard data from on-the-record sources.

I would not be surprised if some developers are frustrated at the specs of the Xbox One and are ready to fuel GAF.

It falls in line with the recent Edge article, which also spoke of sources that were frustrated with the ESRAM.
 

eso76

Member
I think you mean "game didn't really look that stunning to begin with, now we find out one of the reasons why."

except IQ was NEVER a problem with Ryse, and your reply perfectly illustrates my point.

priorities devs, please :(

+ Solid stable 30fps or more
+ 1080p native

I agree on fps, but prioritizing 1080p is just stupid.
IQ should be a priority, I agree with that, but 1080p alone doesn't guarantee it.

Case in point: we just saw the 1080p video and absolutely no one noticed it was sub-full-HD or complained about IQ until someone told you.
 

LTWheels

Member
The price determines what you get. And what you get is not so bad, at least with the PS4. And seriously man, every next gen game so far is a huge leap over current gen. You are free to spend your $1,500 on your 3.5TF system :)

I wasn't arguing about price. Just pure power.

I agree that the value/power proposition of the PS4 is extremely good and worthwhile.

It's that leap in graphics which makes 60fps at 1080p harder to obtain.

The power of these consoles is way overstated. I don't see why people are upset that Ryse/KI can't be played at 1080p 60fps on the XB1, or BF4 at 1080p 60fps on the PS4, etc.
 

IN&OUT

Banned
It falls in line with the recent Edge article, which also spoke of sources that were frustrated with the ESRAM.

Maybe DF will figure out the grave mistake they made when they concluded that the PS4 is only 25% faster than the X1, basing that off GPUs equipped with GDDR5.
 
The vast majority of you would not have noticed 900p upscaled if you weren't told. It takes me back to one of the Halo games, I think it was, where somebody posted a zoomed-in screenshot and actually counted the pixels to prove it was upscaled. Before that it was a fine-looking game, but it suddenly became a blurry mess of pixels once you knew the "true" resolution.
 

KKRT00

Member
Maybe DF will figure out the grave mistake they made when they concluded that the PS4 is only 25% faster than the X1, basing that off GPUs equipped with GDDR5.

They never said that the PS4 is 25% faster... They just tested that CUs do not scale linearly, so a 50% CU advantage does not equal a 50% speed boost. It's normal scientific research, and they said it's in no way conclusive for real-world scenarios on either console.
 

netBuff

Member
The vast majority of you would not have noticed 900p upscaled if you weren't told. It takes me back to one of the Halo games, I think it was, where somebody posted a zoomed-in screenshot and actually counted the pixels to prove it was upscaled. Before that it was a fine-looking game, but it suddenly became a blurry mess of pixels once you knew the "true" resolution.

I immediately noticed it in the "beautiful 1080p footage" thread. And it will only get more noticeable when it's not being judged from heavily compressed video.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Maybe it's automated and just distributed by bits. You could be right.

The physical layout of the memory is irrelevant to the application, much the same way that the number of chips on the RAM sticks in PCs is irrelevant to every game (or any other application for that matter) running on that PC. The physical layout of memory is hidden by multiple layers of abstraction.

Btw, it's irrelevant anyway, because CryEngine 3.5 has a different G-Buffer than Crysis 2.
It's 3x 32-bit buffers, so ~24MB at 1080p.

And that is just the g-buffer.
 

Jack cw

Member
I wasn't arguing about price. Just pure power.

I agree that the value/power proposition of the PS4 is extremely good and worthwhile.

It's that leap in graphics which makes 60fps at 1080p harder to obtain.

The power of these consoles is way overstated. I don't see why people are upset that Ryse/KI can't be played at 1080p 60fps on the XB1, or BF4 at 1080p 60fps on the PS4, etc.
Fair enough. And you are right about 1080p at 60fps. It's more difficult to achieve, but consoles still have the advantage of everyone having the same tech, and with enough optimization it's possible to get at least 1080p at 30fps out of those systems. I wouldn't call them "weak". They are designed to be efficient, not to have infinite horsepower.
 

Thrakier

Member
Maybe you should apply for lead game designer at Sony Studios Japan and show Mark how it really works :)
And while you're there, try to boost that Jaguar CPU speed!

Well, there's really no reason to be ironic here. Do you know any good reason why a game should run at an unlocked framerate below 60FPS when it can be locked at 30 with VSync and motion blur? There is none. An unlocked framerate is always inferior, and yet a lead game designer who designed a console and has been in the industry for ages goes for the shitty, juddery unlocked framerate... well, then I've lost all my faith. Really. At least he states "60FPS would be nice".
 

Chumpion

Member
Riddle me this: I don't understand why they would choose 1600x900 over 1280x1080. The latter has full vertical resolution, scales more cleanly, and has 4% fewer pixels.
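The pixel math behind that, for anyone who wants to verify it:

```python
# Pixel counts for the two sub-1080p options discussed above.
res_a = 1600 * 900    # 1,440,000 pixels, uniform 1.2x upscale on both axes
res_b = 1280 * 1080   # 1,382,400 pixels, full vertical resolution

print(res_b / res_a)  # 0.96 -> 1280x1080 renders 4% fewer pixels
```

The trade-off is that 1280x1080 only stretches horizontally (1.5x), while 1600x900 upscales uniformly on both axes, which some scalers handle more gracefully.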
 

szaromir

Banned
Now, if you want to push the graphics beyond what the 360/PS3 did and have a constant 1080p 60fps in the majority of games, then yes, 1.94TF with what is essentially a laptop CPU is weak.
PS4 doesn't have a "laptop GPU". There are like 5 laptop GPUs that exceed its performance and their cost to the user is astronomical.
 
Please tell me you don't actually fall for that excuse.

I don't, but the difference is that they stated upfront, from the get-go, "this is what it'll be, deal with it."

Ryse was a situation of "oops, sorry, not 1080, it's actually 900."

To me, the only 'news' in this whole thing is that u-turn. Ryse could definitely achieve 1080p if Crytek chose to cut down on the visuals, but if they did that, then Ryse would have very little left going for it.
 

Krilekk

Banned
The vast majority of you would not have noticed 900p upscaled if you weren't told. It takes me back to one of the Halo games, I think it was, where somebody posted a zoomed in screenshot where they actually counted the number of pixels to prove it was upscaled. Before that, it was a fine looking game but suddenly became a blurry mess of pixels once you knew the "true" resolution.

No, the reason it was found out with Halo 3 was that it looked much blockier than it should have if it were running at 720p.
 

nampad

Member
At least Microsoft is honest about this and communicated it beforehand, which kind of surprises me.

Also, the Ryse news keeps being pretty entertaining ;)
 

Jack cw

Member
Well, there's really no reason to be ironic here. Do you know any good reason why a game should run at an unlocked framerate below 60FPS when it can be locked at 30 with VSync and motion blur? There is none. An unlocked framerate is always inferior, and yet a lead game designer who designed a console and has been in the industry for ages goes for the shitty, juddery unlocked framerate... well, then I've lost all my faith. Really. At least he states "60FPS would be nice".

It's not like he's doing the tech alone. There's a whole team working on that game, and they have regular meetings and people in charge of the engine etc. He's aware that 60fps would be ideal. Everyone who knows a bit about motion on displays knows that you have to go for a multiple of 60Hz. This isn't his first game, dude. Don't be so dramatic just because you heard something from someone who heard it from Cerny.
 

Durante

Member
And he's also correct in stating that this isn't like last gen. We are no longer talking about the difference between 720p or non-hd resolutions that are much more blurry or lacking in IQ. We are talking 1080p or slightly lower resolutions, such as 900p. These systems can do any number of things to make a game look amazing at a lower resolution. A perfect example is that nobody even knew Ryse was doing this the entire time. That pretty much says it all right there.
It's exactly the same as last gen. The reason no one knew for certain is that all the material they released so far was either bullshit, downscaled or terribly compressed.

And in fact, I pointed out some questionable IQ in a Ryse thread a while ago. It was noticeable even through the compression.

In this thread:
"game looks stunning until you're told it shouldn't"
Ryse never looked stunning, even compared to Crytek's own Crysis 3.

The vast majority of you would not have noticed 900p upscaled if you weren't told. It takes me back to one of the Halo games, I think it was, where somebody posted a zoomed-in screenshot and actually counted the pixels to prove it was upscaled. Before that it was a fine-looking game, but it suddenly became a blurry mess of pixels once you knew the "true" resolution.
If you're talking about Halo 3, it was always a horribly bloody mess in terms of IQ, no pixel counting required. At all.

Well, really no reason to be ironic here. Do you know any good reason why a game should run at an unlocked framerate below 60FPS when it can be locked at 30 with VSync and Motion Blur?
Better responsiveness? On PC, I greatly prefer a variable 45+ framerate to locked 30. (Of course, that's particularly applicable on 120 Hz displays where the frame slots and thus temporal judder are smaller)
 

mjswooosh

Banned
Considering that the gameplay looks like a simplistic button masher + QTE fest yawner, I'd say the none-too-surprising downscale to 900p is the least of Ryse's problems.

MS made too many cost-conscious compromises on the XBone's specs because they believe(d) Kinect 2.0 is worth the pack-in expense. Expect the Ryse downscale story to become the rule, not the exception, as the next gen progresses. Without moneyhatting to purposefully bork PS4 versions, the XBone is going to increasingly have trouble keeping up. This really isn't news. It's called *math*.
 

velociraptor

Junior Member
Guess that explains why the visuals look so good. And it looks like EDGE was right. The Xbox is probably going to have many games running at 900p.
 

szaromir

Banned
It's exactly the same as last gen. The reason no one knew for certain is that all the material they released so far was either bullshit, downscaled or terribly compressed.
They released a 1080p video with very little compression just a couple days ago and people were praising the image quality of the game.
 

LTWheels

Member
Fair enough. And you are right about 1080p at 60fps. It's more difficult to achieve, but consoles still have the advantage of everyone having the same tech, and with enough optimization it's possible to get at least 1080p at 30fps out of those systems. I wouldn't call them "weak". They are designed to be efficient, not to have infinite horsepower.

You say that, but if you consider this generation, graphics sell, not resolution. As the generation went on, developers improved the graphics at the cost of resolution in order to maintain 30fps. Look at how many sub-720p games are out on current consoles.

I can see that by the end of next gen we will be seeing a lot of 720p 60fps games as devs push the graphical capabilities of the machines.

PS4 doesn't have a "laptop GPU". There are like 5 laptop GPUs that exceed its performance and their cost to the user is astronomical.

CPU, not GPU.
 

dark10x

Digital Foundry pixel pusher
Better responsiveness? On PC, I greatly prefer a variable 45+ framerate to locked 30. (Of course, that's particularly applicable on 120 Hz displays where the frame slots and thus temporal judder are smaller)
Oof, really?
 

madmackem

Member
Please tell me you don't actually fall for that excuse.

You know otherwise? Man, Leadbetter is just lol-worthy these days; some of the shit he comes out with is crazy. "Ohh, it's not all that, look, we took some pics off a PC game." He makes DF lose credibility with each article.
 

KKRT00

Member
The physical layout of the memory is irrelevant to the application, much the same way that the number of chips on the RAM sticks in PCs is irrelevant to every game (or any other application for that matter) running on that PC. The physical layout of memory is hidden by multiple layers of abstraction.
I've edited my post with what I had in mind :)

And that is just the g-buffer.

Yeah, but those are the most memory-intensive ones.
So we have:
G-Buffer - 3x 8MB buffers
Deferred buffers - 2x 8MB (2x R11G11B10F)
Final buffer - 16MB (R16G16B16A16F)

You can easily fit the G-Buffers first, calculate with them, and then load the deferred and final buffers into ESRAM.
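A quick sanity check of that two-phase fit (buffer sizes as listed above; whether an engine would actually page render targets in and out of ESRAM this way is an assumption on my part):

```python
# ESRAM budget sketch for the buffer sizes listed above (all ~MB at 1080p).
ESRAM_MB = 32

gbuffer   = 3 * 8   # three 32-bit render targets, ~8 MB each
deferred  = 2 * 8   # two R11G11B10F targets
final_buf = 16      # one R16G16B16A16F target

phase1 = gbuffer               # geometry pass: fill the G-buffer first...
phase2 = deferred + final_buf  # ...then swap in lighting + final targets

print(phase1, phase2)  # 24 32 -- each phase fits within the 32 MB ESRAM
```

So at 1080p both phases squeeze in, but only just; the 32MB ceiling leaves no headroom for anything else, which is presumably why lower render targets look attractive.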
 