
Xbox One | Ryse confirmed running native at 900p


Thrakier

Member
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.

Wait, Cerny himself said that they are going for 40 (!??!?!?!?) FPS?

Are you serious? This is frightening if true, because it basically says that he knows fucking nothing about framerate and smoothness.
 

IN&OUT

Banned
WOW @ DF

After a generation of shredding the PS3 over a 5fps loss in multiplats, losing 1/3 of the pixels is no big deal now?!

Sigh :/
 

twobear

sputum-flecked apoplexy
Kudos to MS for coming clean about the resolution, even if it is just to avoid the otherwise inevitable post-launch 900pgate.

Disconcerting that Xbone developers are already running into technical limitations though. I wonder if 900p will be the go-to resolution for Xbone devs to minimise the visual disparity with PS4 at 1080p?
 

mocoworm

Member
Xbox One exclusive Ryse runs at 900p - Full HD off the table for Crytek's showcase. Digital Foundry assesses the news.

http://www.eurogamer.net/articles/digitalfoundry-ryse-runs-at-900p

"Microsoft has revealed that Xbox One launch title Ryse is not running at native 1080p.

Aaron Greenberg, chief of staff for the Devices and Studios Group, tweeted that the game will actually be running at "900p". Greenberg confirmed 1080p rendering for showpiece title Forza Motorsport 5, but it's already known that Killer Instinct - somewhat surprisingly - has targeted an internal 720p rendering resolution.

While some might be disappointed that some of the firm's first party exclusives aren't running at full HD, the situation is somewhat reminiscent of the Xbox 360 launch, where key titles like Project Gotham Racing 3 and Perfect Dark Zero failed to hit the native 720p target that Microsoft mandated in its own technical requirements.

In the case of Xbox One, the system's graphical performance level has been something of a moving target since near-final silicon shipped to developers - clock-speeds on the CPU and GPU have increased, while the software-side driver has seen considerable revision. Bearing in mind that games take upwards of two years to develop, decisions on rendering resolutions must surely have been taken prior to the console's hardware spec being set in stone.

Developer sources have also suggested that the 32MB of ESRAM - the fast scratchpad memory for high-speed graphics processing - may favour lower resolution render targets. This is a topic we hope to return to soon with some hard data from on-the-record sources.

So what does the reduction in resolution actually mean for image quality? It's a topic we recently covered in our article on how Xbox One games can compete with PS4, using Crysis 3 - one of the most visually rich of PC titles - as our test-bed. Coincidentally, this game hails from the same developer as Ryse and is based on the same underlying CryEngine technology. For the purposes of this test, we've assumed that "900p" sees the same level of scaling on the x-axis too, giving us a 1600x900 framebuffer.

While we'd clearly prefer 1080p as a target resolution, it's safe to say that the reduction in quality isn't quite as impactful as you might expect in these shots - surprising bearing in mind that we're looking at a one-third drop in overall resolution. In motion, larger, upscaled pixels may be more noticeable but persistent on-screen elements - like the HUD - are likely to be rendered at native 1080p.

The takeaway from our testing with the detail-rich Crysis 3 was that the situation isn't as much of an issue as it is on current-gen consoles, where sub-720p imagery can look really grim, particularly in concert with post-process anti-aliasing. Clearly though, final judgement on multi-platform next-gen titles that employ different resolutions will be reserved for our hands-on comparisons with final retail code.

Upscaling has come a long way since the current-gen consoles launched in 2005/2006, and we can imagine that developers of both Xbox One and to a lesser extent PS4 titles will employ sub-native framebuffers to hit their performance targets, especially in first-gen games. Given the choice between a consistent gameplay experience at 900p or 1080p with a frame-rate hit, we'd take the smoother performance every time."
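
For reference, the pixel arithmetic behind the article's "one-third drop" claim is easy to check. A quick sketch in Python, assuming the 1920x1080 and 1600x900 framebuffers the article uses:

# Compare the assumed 900p framebuffer against native 1080p.
full_hd = 1920 * 1080          # 2,073,600 pixels
nine_hundred_p = 1600 * 900    # 1,440,000 pixels

print(f"overall drop:   {1 - nine_hundred_p / full_hd:.1%}")   # ~30.6%, the article's "one-third"
print(f"upscale factor: {1920 / 1600:.2f}x per axis")          # 1.20x on each axis to reach 1080p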
 

nib95

Banned
Xbox One exclusive Ryse runs at 900p - Full HD off the table for Crytek's showcase. Digital Foundry assesses the news.

http://www.eurogamer.net/articles/digitalfoundry-ryse-runs-at-900p

I hope the 32MB of ESRAM doesn't end up being just slightly too small for more useful implementations, similar to how the 10MB of eDRAM was just shy of the amount needed for the "free" AA Microsoft wanted at 720p.
 

amardilo

Member
All of this resolution talk means barely anything to me on my 32in gaming TV lol. Highly doubt I'll notice the difference between 900p and 1080p.

I guess it really depends on the size of the screen and how far you are from it.

I currently have a 32" TV that I got in 2005; it can't do 1080p, and I sit about 6 feet away. With my setup it doesn't make a difference. I was planning on getting a new TV (37"-42", one that can do 1080p), but I think I might hold off until more games like Forza, which run at 1080p, come out.
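
Whether that gap is visible really does come down to screen size and viewing distance. A rough angular-resolution sketch in Python, using the 32" screen at roughly 6 feet described above as the assumed setup (the ~60 pixels-per-degree figure is a commonly quoted rule of thumb for normal acuity, not a hard limit):

import math

# Assumed setup: 32" 16:9 screen viewed from about 6 feet (72 inches).
diagonal_in = 32.0
distance_in = 72.0
width_in = diagonal_in * 16 / math.hypot(16, 9)                     # ~27.9" wide panel

fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))   # ~22 degrees of view

for label, px in [("1080p", 1920), ("900p-wide", 1600), ("720p", 1280)]:
    print(f"{label:>10}: {px / fov_deg:.0f} pixels per degree")

# All three land near or above ~60 px/deg, so at this size and distance
# the 900p/1080p difference is plausibly hard to spot.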
 

Hollow

Member
Xbox One exclusive Ryse runs at 900p - Full HD off the table for Crytek's showcase. Digital Foundry assesses the news.

http://www.eurogamer.net/articles/digitalfoundry-ryse-runs-at-900p

Leadbetter article?

Leadbetter article.
 
WOW @ DF

After a generation of shredding the PS3 over a 5fps loss in multiplats, losing 1/3 of the pixels is no big deal now?!

Sigh :/

What's so surprising? They clearly say framerate is more important than resolution, so it should be expected that they bitch about framerate more than about pixels.
 

scandisk_

Unconfirmed Member
Show them these lol. To me GT is still the only racer to achieve near photorealism in GIFs.

[GIF: gt5time3kuj1.gif] [GIF: ibyAZDpSr0zHfm.gif]

OT, but just imagine the next-gen GT running on PS4. oooohh sweet lawd jebus! GT's next-gen lighting engine would probably melt our faces.
 

KKRT00

Member
Leadbetter article?

Leadbetter article.

And what's wrong with that article?

Only Leadbetter can come out and say 900p instead of 1080p doesn't make a difference if it's on Xbox One...

He hasn't said that.

---
Yes, CryEngine 3 uses a g-buffer with a depth of 64 bits (Depth24Stencil8 and A8B8G8R8) per pixel [1]. That would be ~16MB at 1080p and ~11MB at 900p. The other buffers account for a total of ~30MB at 1080p on the PC, resulting in a total sum of ~46MB at 1080p. It might well be that the decision to go with 900p is because of ESRAM size limitations.

[1] http://de.slideshare.net/TiagoAlexSousa/secrets-of-cryengine-3-graphics-technology (slide 5)

I don't think it would help. 11MB at 900p is still more than you can fit into ESRAM, because ESRAM is divided into four 8MB chunks.
https://semiaccurate.com/assets/uploads/2013/08/XBO_diagram_WM.jpg
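
A quick sketch of the arithmetic in the quoted post (Python; the 64 bits per pixel and the ~30MB of "other buffers" are the quoted post's PC-side assumptions, not confirmed Xbox One figures):

MB = 1024 * 1024
BYTES_PER_PIXEL = 8        # Depth24Stencil8 + A8B8G8R8, as per the quoted slides
OTHER_BUFFERS_MB = 30      # assumed PC figure from the quoted post; some of it would shrink at 900p too
ESRAM_MB = 32

for label, w, h in [("1080p", 1920, 1080), ("900p", 1600, 900)]:
    gbuffer_mb = w * h * BYTES_PER_PIXEL / MB
    total_mb = gbuffer_mb + OTHER_BUFFERS_MB
    print(f"{label}: g-buffer ~{gbuffer_mb:.0f}MB, total ~{total_mb:.0f}MB vs {ESRAM_MB}MB of ESRAM")

# -> ~16MB / ~46MB at 1080p and ~11MB / ~41MB at 900p: dropping the resolution
#    alone doesn't make everything fit, which is the point being argued here.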
 

KidBeta

Junior Member
And whats wrong with that article?

---


I don't think it would help. 11MB at 900p is still more than you can fit into ESRAM, because ESRAM is divided into four 8MB chunks.
https://semiaccurate.com/assets/uploads/2013/08/XBO_diagram_WM.jpg

You can fit 32MB into the eSRAM; the chunking has nothing to do with the max size, other than the fact that the max size is the number of chunks * the chunk size.

It's pretty obvious IMO that this was done to fit the g-buffer into the 32MB of eSRAM.
 

Honey Bunny

Member
So, I'm not a sim enthusiast now? :) My wheels, pedals, rigs, and collection would beg to differ.

My bringing up my anecdotal observation was only in reference to the 'shit' and 'current gen' comments. Mine isn't an argument about what looks more realistic - rather, it is the overly dismissive and reductive nature of these discussions that quickly reach hyperbolic levels. This does service to no-one.

No, I was making the assumption that your colleague isn't a sim enthusiast. I thought you were using his comment as a cheap way of dismissing what are, to me, details worth discussing on a forum like this. That, and seeing someone proclaim that dynamic weather and time aren't going to be a big deal for sim enthusiasts either, made my head spin a little bit; my bad that I misinterpreted.
 

dark10x

Digital Foundry pixel pusher
Wait, Cerny himself said that they are going for 40 (!??!?!?!?) FPS?

Are you serious? This is frightening if true, because it basically says that he knows fucking nothing about framerate and smoothness.
Yes, he told me this.

He said that they certainly would like to have hit 60 fps for the game but won't be able to deliver it at launch. He stated 40 fps with triple buffering. His reasoning was that platform games benefit from the highest framerate possible, so they'd rather aim for 40 fps (which no doubt means an unlocked framerate, though he did not specifically state that).

The demo I played did have pockets of 60 fps but generally ran similarly to God of War Ascension. It might seem troubling that the game being produced by the chief architect of the system would have such difficulties, but honestly, I suspect we're still seeing limitations of the Japanese studios mixed with the fact that it's a launch game.

He certainly seemed to agree that they hope to see 60 fps become more common but, in the case of Knack, it wasn't going to be possible by launch.

That said, a lot of the lesser known PS4 titles shown were all running at 60 fps on devkit hardware, which was encouraging. Only the "big" titles seem to be settling for less.
 

IN&OUT

Banned
What's so surprising? They clearly say framerate is more important than resolution, so it should be expected that they bitch about framerate more than about pixels.

They nitpicked everything last gen - framerates, resolutions, AA... I'm sensing a softness in their tone now, knowing that the X1 is embarrassingly weak compared to the PS4.

Their GTA V face-off is due today... It would be the last hurrah for the Xbox brand (at least the last game where they could skew their judgement). The next six years are going to be difficult for Leadbetter.
 

NBtoaster

Member
Definitely not smart nor a realistic shortcut. Drive your car around and check whether the car in front is reflected on the hood.

The actual term is "cop out"

Screen space reflections are not cop-outs; they're an advanced technique that almost every next-gen game is using: Killzone SF, Ryse, Infamous SS, probably DC too.

The player name is reflecting because it's part of the 3D scene. It's not a sign of technical weakness.
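
For anyone unfamiliar with the technique being argued about: screen-space reflections can only reflect what is already in the rendered frame (its colour and depth buffers), which is exactly why an element that is part of the 3D scene shows up in them. A toy sketch of the idea in Python, flattened to one screen axis purely for illustration - not engine code:

W = 64                              # screen width in pixels (toy example)
depth = [1.0] * W                   # per-column depth, 1.0 = far plane
color = [0.0] * W                   # per-column brightness
for x in range(40, 50):             # a bright "object" nearer to the camera
    depth[x] = 0.3
    color[x] = 1.0

def ssr_trace(x0, step=1, max_steps=32):
    """March across the screen from pixel x0 and return the colour of the first
    on-screen sample that sits in front of the starting surface."""
    d0 = depth[x0]
    for i in range(1, max_steps):
        x = x0 + i * step
        if x < 0 or x >= W:
            return 0.0              # ray left the screen: nothing to reflect
        if depth[x] < d0:
            return color[x]         # hit something nearer - reflect its colour
    return 0.0

print(ssr_trace(20))                # -> 1.0: the on-screen object is "reflected"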
 

Vitor711

Member
Someone should tell Crytek that rendering at a lower resolution and upscaling is horrible. The game will look bad no matter how much sfx you put into it; if the game has shitty rendering, it will look bad.

It should look glorious on a 720p TV though. But it won't be as pretty as native 1080p, that's for sure.

http://www.eurogamer.net/articles/digitalfoundry-ryse-runs-at-900p

I do take issue with the comparison where they say that 1080p vs 900p won't be a huge difference.

I had to lower the res of some games from native 1080p to 900p on my old laptop, and it was certainly enough to notice. Not sure how much upscaling improvements can mask that, but native was always better.
 

KKRT00

Member
You can fit 32MB into the eSRAM; the chunking has nothing to do with the max size, other than the fact that the max size is the number of chunks * the chunk size.

It's pretty obvious IMO that this was done to fit the g-buffer into the 32MB of eSRAM.

Why does it have nothing to do with it? You have to tile the buffer to fit it into those RAM chunks.
And if you have to tile buffers, you can easily tile them to fit bigger buffers too.

--
And never mind that those buffers are old news - they've changed the buffers a little in CE 3.5.
 
They nitpicked everything last gen - framerates, resolutions, AA... I'm sensing a softness in their tone now, knowing that the X1 is embarrassingly weak compared to the PS4.

Their GTA V face-off is due today... It would be the last hurrah for the Xbox brand (at least the last game where they could skew their judgement). The next six years are going to be difficult for Leadbetter.

If Leadbetter has a bias, and I'm not sure he does, it's towards getting a lot of clicks. He doesn't care which console he's championing as long as he gets readers.
 

Seanspeed

Banned
http://www.eurogamer.net/articles/digitalfoundry-ryse-runs-at-900p

I do take issue with the comparison where they say that 1080p vs 900p won't be a huge difference.

I had to lower the res of some games from native 1080p to 900p on my old laptop, and it was certainly enough to notice. Not sure how much upscaling improvements can mask that, but native was always better.

It might be noticeable if you go from one right to the other. But I guarantee the vast majority of people will not be able to tell from the get-go. And yes, upscaling does make a difference, too.

Anybody who wants to make this out to be a big deal is probably a fanboy.
 

KidBeta

Junior Member
Why does it have nothing to do with it? You have to tile the buffer to fit it into those RAM chunks.
And if you have to tile buffers, you can easily tile them to fit bigger buffers too.

The programmer isn't the person who deals with the 8MB chunks - that's up to the hardware guys. What happens is you read and write to addresses that are in the eSRAM's range, which is probably a 32MB range of memory.

The reason there are four 8MB chunks is to increase performance; the addresses are probably interleaved between all four as well.
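
A minimal sketch of what that interleaving could look like (Python; the granularity and mapping here are assumptions for illustration - the actual hardware scheme isn't documented in this thread):

NUM_BANKS = 4                          # the four 8MB chunks
LINE_BYTES = 256                       # assumed interleave granularity (illustrative)
ESRAM_BYTES = 32 * 1024 * 1024         # one flat 32MB address range

def bank_for(address):
    """Which bank an ESRAM-range address maps to under simple interleaving."""
    assert 0 <= address < ESRAM_BYTES
    return (address // LINE_BYTES) % NUM_BANKS

# An 11MB render target written linearly touches all four banks in turn,
# without the programmer ever placing data "into a chunk" by hand.
target_bytes = 11 * 1024 * 1024
touched = {bank_for(a) for a in range(0, target_bytes, LINE_BYTES)}
print(sorted(touched))                 # [0, 1, 2, 3]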
 

Thrakier

Member
Yes, he told me this.

He said that they certainly would like to have hit 60 fps for the game but won't be able to deliver it at launch. He stated 40 fps with triple buffering. His reasoning was that platform games benefit from the highest framerate possible, so they'd rather aim for 40 fps (which no doubt means an unlocked framerate, though he did not specifically state that).

The demo I played did have pockets of 60 fps but generally ran similarly to God of War Ascension. It might seem troubling that the game being produced by the chief architect of the system would have such difficulties, but honestly, I suspect we're still seeing limitations of the Japanese studios mixed with the fact that it's a launch game.

He certainly seemed to agree that they hope to see 60 fps become more common but, in the case of Knack, it wasn't going to be possible by launch.

That said, a lot of the lesser known PS4 titles shown were all running at 60 fps on devkit hardware, which was encouraging. Only the "big" titles seem to be settling for less.

Well, my problem here is less that Knack isn't running at 60 FPS, though that does seem weird considering the graphical impression the game gives so far; it looks like it should run at 120 FPS.

No, my problem is that the chief architect of the PS4 seemingly doesn't know that an unlocked 40 FPS framerate is a stuttery mess, while a locked 30 FPS with motion blur may be acceptable to some as far as motion judder goes. That doesn't bode well for next-gen framerates.
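
For what it's worth, the stutter complaint is about frame pacing on a 60Hz display: any framerate that isn't an integer divisor of the refresh rate ends up alternating between one- and two-refresh frames. A small Python sketch of that effect, assuming plain vsync (no adaptive sync, which 2013 TVs don't offer):

import math

REFRESH_MS = 1000 / 60                 # ~16.7ms per refresh on a 60Hz display

def present_intervals(frame_time_ms, frames=8):
    """Milliseconds between frames actually appearing on screen with plain vsync."""
    intervals, ready, last_vsync = [], 0.0, 0
    for _ in range(frames):
        ready += frame_time_ms                                   # frame finishes rendering
        vsync = max(last_vsync + 1,
                    math.ceil(ready / REFRESH_MS - 1e-9))        # next refresh after that
        intervals.append(round((vsync - last_vsync) * REFRESH_MS, 1))
        last_vsync = vsync
    return intervals

print("unlocked ~40fps:", present_intervals(25.0))       # 33.3/16.7 alternation -> judder
print("locked 30fps:   ", present_intervals(1000 / 30))  # a steady 33.3ms cadence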
 

Kuro

Member
The only game confirmed to be running below 1080p on the PS4 is BF4, but the devs have said the resolution is not set in stone yet and they are still optimising.

The Order is slightly below 1080p, and the same goes for The Evil Within, because they're going for a more cinematic aspect ratio.
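
As a rough illustration of why a cinematic aspect ratio lands "slightly below 1080p": a 2.40:1 letterbox inside a 1920-wide frame (an assumed ratio for illustration, not a confirmed spec for either game) cuts the active pixel count by roughly a quarter. A quick Python check:

width = 1920
aspect = 2.40                              # assumed "scope" aspect ratio (illustrative)
height = round(width / aspect)             # 800 active rows instead of 1080

active, full = width * height, 1920 * 1080
print(f"{width}x{height}: {active:,} pixels, {1 - active / full:.0%} fewer than full 1080p")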
 
Leadbetter article?

Leadbetter article.

I don't get it; what the guy says is mostly accurate. The only remotely controversial thing he said is that, to a lesser extent, devs may reduce resolution on PS4 titles to meet their performance goals - and that's only slightly controversial. The rest is pretty damn basic stuff. This is very reminiscent of the 360 launch, when certain titles didn't hit a specific resolution. He's far from wrong about this. I find it hilarious that people have an issue with him stating this. On top of that, who wouldn't choose lower resolution and smoother gameplay over higher resolution with bad performance?

And he's also correct in stating that this isn't like last gen. We are no longer talking about the difference between 720p and non-HD resolutions that are much blurrier or lacking in IQ; we are talking about 1080p versus slightly lower resolutions, such as 900p. These systems can do any number of things to make a game look amazing at a lower resolution. A perfect example is that nobody even knew Ryse was doing this the entire time - that pretty much says it all right there.

It's common sense that the negative consequences of not running native 1080p aren't as severe as running sub-720p, since the resolution is likely to still be pretty high regardless, even once devs start using dynamic resolutions. And to this day I still consider Alan Wake, a heavily sub-HD Xbox 360 game, one of the most amazing-looking current-gen games I've played, so it really all comes down to what your definition of a great-looking game is. People can get caught up in the numbers if they wish.

Anyway, I'm heading out. God, I hate the days when I have to leave for work early :(
 

Hollow

Member
And whats wrong with that article?

Nothing's wrong with it per se; I just find it a little ironic that a guy who nitpicks over every pixel in face-offs suddenly doesn't feel like the difference between 900p and 1080p will be that large, due to some upscaling.
 

gruenel

Member
Well, my problem here is less that Knack isn't running at 60 FPS, though that does seem weird considering the graphical impression the game gives so far; it looks like it should run at 120 FPS.

No, my problem is that the chief architect of the PS4 seemingly doesn't know that an unlocked 40 FPS framerate is a stuttery mess, while a locked 30 FPS with motion blur may be acceptable to some as far as motion judder goes. That doesn't bode well for next-gen framerates.

Play God of War 3.

It's fine.
 

Jack cw

Member
Well, my problem here is less that Knack isn't running at 60 FPS, though that does seem weird considering the graphical impression the game gives so far; it looks like it should run at 120 FPS.

No, my problem is that the chief architect of the PS4 seemingly doesn't know that an unlocked 40 FPS framerate is a stuttery mess, while a locked 30 FPS with motion blur may be acceptable to some as far as motion judder goes. That doesn't bode well for next-gen framerates.

Maybe you should apply for the lead game designer position at Sony Studios Japan and show Mark how it really works :)
And while you're there - try to boost that CPU speed on the Jaguar!
 

LTWheels

Member
Yeah, a 1.84 TF machine for $399 is so weak.

What's price got to do with anything?

If next gen were just about keeping the same graphical fidelity as the 360/PS3, then these consoles would be strong enough for 1080p 60fps. See: the cross-gen ports.

Now, if you want to push the graphics beyond what the 360/PS3 did and have a constant 1080p 60fps in the majority of games, then yes, 1.84 TF with what is essentially a laptop CPU is weak.
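
The raw scaling factor being described is easy to put a number on. A rough Python sketch, treating pixels-per-second as a stand-in for GPU load (which ignores everything that doesn't scale with resolution or framerate):

last_gen = 1280 * 720 * 30        # pixels per second at a typical 720p30 target
next_gen = 1920 * 1080 * 60       # pixels per second at 1080p60

print(f"resolution factor: {1920 * 1080 / (1280 * 720):.2f}x")   # 2.25x the pixels per frame
print(f"framerate factor:  {60 / 30:.0f}x")                      # 2x the frames per second
print(f"combined:          {next_gen / last_gen:.1f}x")          # 4.5x the pixel throughput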
 

eso76

Member
In this thread:
" game looks stunning until you're told it shouldn't"


if you want to push the graphics beyond what the 360/PS3 did and have a constant 1080p 60fps in the majority of games, then yes, 1.84 TF with what is essentially a laptop CPU is weak.

But I've seen at least one (launch) game pushing graphics FAR beyond 360/PS3 level at 1080p/60fps.
With fewer TFs...
 

IN&OUT

Banned
Yeah, a 1.84 TF machine for $399 is so weak.

Yeah, I've noticed Xbox fanboys hiding behind the PC and trying to put the PS4/X1 in the same ballpark of weakness... a sad act, really.

The scenario goes like this:

News: Ryse is sub-1080p.
X1 fanboys: These consoles are weak, get a PC!
 