
Techland: Xbox One's ESRAM "really smoothes out the optimization process"

Indeed, the game has some pretty open scenes, for instance the colosseum. The game also has the best facial animations and vegetation I have seen so far.

Nevertheless, I fail to see how Ryse can be an argument in the assessment of the ESRAM architecture. It's 900p after all, and such limitations of render target sizes are precisely what most of us were expecting as a consequence of the XBO's memory architecture.
And if you slowed BF4 down from 60fps to a sub-30 frame rate, imagine what kinds of textures and effects you could add to it.
 
So are they fully explorable or are they just there for decoration and smoke and mirrors to disguise a small arena joined on to a corridor?

The colosseum is fully explorable. Other levels have a linear design gameplay-wise, but rendering-wise the drawing distance is well beyond "confined corridors". For instance, in the second half of the forest level, you climb up a big aqueduct with a clear view of the entire environment: http://www.youtube.com/watch?v=AnKY4k-28Ew

It's fair to mention that many, if not most, screenshots posted here over the last few months came from pre-rendered cutscenes, especially big scenes like the sacking of the fortress at the coast or the flight over the battle in burning Rome. Nevertheless, while the actual in-game environments do have confined scenes (e.g. the forest, the streets of Rome, ...), there are also some more open scenes. They are not as impressive as the scenes shown in those cutscenes, and they have less detail compared to scenes like the forest, but they are still pretty.
 
Pretty much agree with this. No console game is even close to looking as good as Ryse. Actions speak louder than words.
How can you say Killzone isn't even close? I'd argue it surpasses it. Holding up in first person is more impressive because you get much closer to textures and models.
 
It's all bollocks anyway. GDDR5 has pretty much the same response times as DDR3; it's the memory controllers on PC video cards that create the latency, since quick response isn't needed there. That's not the RAM's fault.

PS4 uses custom memory controllers designed for the job they need to do.

That's even better than I thought
 
Did you play it? Because I can't see anyone who has played the game saying the environments are confined and enclosed, and that there is nothing intense going on...

Are you really trying to imply it's an open world game?

As far as having problems with the frame rate I am not the only one.

Digital Foundry:
Ryse misses the mark more often than we'd like with frame-rates often fluctuating between 26-28fps. The most challenging situations even see the frame-rate drop into the teens

You can tell who hasn't played Ryse. Ryse doesn't have low frame rate, at all.

Keep fighting the good fight.
 
I noticed nobody has drawn the analogy between the PS2's memory architecture and the X1's.

Technically the X1's memory architecture is not unified; it has two pools of memory: one big pool for the CPU and GPU, and the other, the ESRAM, for the GPU. It is closer to the PS2 than to the X360.

On the X360, the eDRAM was really just a framebuffer: specialized memory dedicated to one specific GPU operation.

On the X1, the ESRAM is really just video RAM, similar to the VRAM on the PS2. The first problem: do you remember the reception it got from devs? Mostly negative; the VRAM was fast but too small, and devs really struggled with it. Remember that when the Dreamcast had 8MB of VRAM, the PS2 released a year later with only 4MB. And remember they were talking up those low latencies of the PS2's VRAM...

But the PS2, compared to the X1, had tons of VRAM: 4MB against 32MB of main RAM. The X1's VRAM (let's just call it what it really is) is only 32MB against roughly 8000MB of main RAM.

That's a regression in memory architecture. While PlayStation evolved from |PS2: 4MB/32MB| to |PS3: 256MB/256MB| to |PS4: true unified 8GB, final form|, the X1 regressed straight back to PS2 architecture levels.

Like a regression into a subconscious hidden wish in a Freudian dream.
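The ratio argument above can be put in numbers with a quick back-of-the-envelope script (a sketch; the figures are the commonly cited totals, ignoring OS reservations):

```python
# Fast graphics memory vs main memory, for the consoles discussed above.
# Commonly cited totals; OS reservations ignored.
configs = [
    ("PS2 (VRAM/main)",   4,  32),
    ("X360 (eDRAM/main)", 10, 512),
    ("XB1 (ESRAM/main)",  32, 8192),
]
for name, fast_mb, main_mb in configs:
    ratio = fast_mb / main_mb * 100
    print(f"{name}: {fast_mb} MB / {main_mb} MB = {ratio:.2f}%")
```

Under those figures, the PS2's VRAM was 12.5% the size of its main pool, while the XB1's ESRAM is about 0.39% of its main pool, i.e. proportionally around 30x smaller.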
 
Pretty much agree with this. No console game is even close to looking as good as Ryse. Actions speak louder than words.

Keep telling yourself this. Someday it will--wait, no it won't.

KZ is simply doing a lot more in a semi-open environment. It's not even arguable how far ahead it is on the main tech specs: 1080p native. Huge. TSSAA is great, but Ryse does have some decent AA too. The most telling part is the framerate. KZ simply puts Ryse to sleep here. No doubt whatsoever. People are going to be on this Ryse thing for ages, because when all you have is one straw to grasp, you try your best to grab it.
 
He's right... the only times the frame rate drops are right out of FMV cutscenes.

I definitely encountered occasional framerate drops during gameplay. I remember one during the last battle in the "Scotland" chapter and in the forest level, IIRC. Nothing really tragic, and the game is smooth almost all the time, but it isn't perfectly locked at 30fps. But which game is...
 
Pretty much agree with this. No console game is even close to looking as good as Ryse. Actions speak louder than words.

Don't mistake artistic direction for a testament to console power. Because in some ways, The Last of Us looks better than Ryse.

[image: 3laLdpe.jpg]


Now imagine what The Last of Us would look like on the PS4.
 
Not really. There is this thing called latency, which DDR3 has less of than GDDR5, making it far better for doing large numbers of tasks at once. DDR3 is better than GDDR5 for pretty much anything that isn't graphics, which is what GDDR5 was created for.

This has already been thoroughly debunked.
 
You can tell who hasn't played Ryse. Ryse doesn't have low frame rate, at all. Also, all attempts to downplay the graphical impressiveness of Ryse should be treated as the jokes that they are. The game looks nothing short of extraordinary, and it's pretty sad that due to console wars silliness people can't bring themselves to admit even that much, and have to find all sorts of ways to explain it away. I've completed a total of 4 chapters in that game, and encountered nothing approaching low framerate or serious frame drops. Ryse is one of the most impressive looking games I've ever seen on any platform, including PC. That's not me saying that nothing on the PC comes close, or nothing on the PS4 comes close, but I think Ryse, graphically, is quite an achievement no matter what you compare it to.

And I will never understand what telling us how much better Ryse would look on the PS4 accomplishes, as I've seen at least one poster say in this thread. What's the point of that? But I guess I knew what to expect with this thread, so yea. :)
I'm sure even if it did, with the number of enemies on screen it would be perfectly acceptable, right?
 
What's so good about blurring the distant image? Especially when that distance is no more than 20 meters ahead.
If we're strictly talking about technical accomplishments, that's a pretty high-quality DOF effect. It's not like they just passed a blur filter over the image. Its absence in the XBO version is potentially indicative of less graphical power. Which is why it was brought up.
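For reference, a "high-quality" DOF like the one being discussed is usually driven by a per-pixel circle of confusion computed from depth, rather than a uniform blur filter. A minimal thin-lens sketch (all lens values are illustrative, not taken from any game):

```python
def circle_of_confusion(depth_m, focus_m, focal_mm=50.0, f_stop=1.8):
    """Thin-lens circle-of-confusion diameter (mm) for a point at depth_m
    when the lens is focused at focus_m. Illustrative values only."""
    f = focal_mm / 1000.0          # focal length in metres
    aperture = f / f_stop          # aperture diameter in metres
    # Standard thin-lens CoC formula: blur grows with distance from focus.
    coc = aperture * (f / (focus_m - f)) * abs(depth_m - focus_m) / depth_m
    return coc * 1000.0            # back to millimetres

# A point 20 m away with focus at 5 m blurs far more than one at 6 m.
print(circle_of_confusion(20.0, 5.0))
print(circle_of_confusion(6.0, 5.0))
```

The per-pixel CoC then drives the size of the bokeh kernel; points at the focal distance get zero blur, which is what separates a proper DOF pass from a plain blur over the whole frame.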
 
No, it's down to how the engine employs the effect as coded for each platform (in this case a bokeh effect, a high-quality example of which we have actually seen in Ryse, which employs not only a DOF effect but also per-object motion blur, both simultaneously, in this gameplay image:).

Indeed, Ryse uses bokeh quite frequently to mask low-quality assets in the distance.
 
Weird hearing somebody say something positive about this.

However, I watched the Dying Light videos at the VGX. When the Techland guy was standing with Geoff/Joel talking as the video played, they outright said it was 1080p, and watching it, it was very smooth. Which surprised me, because the Dead Island titles had consistent frame rate drops.

Then, when they sat down to talk and started the next gameplay video, the frame rate was horrible... the graphics looked a little worse too. I was like, "Damn it, Techland are bullshitting... or maybe this gameplay is the PS3 version." After watching, I noticed Xbox buttons popping up on screen. Take from this what you will. Maybe this was the 360 version.

Anyone else notice this?
 
Not really. There is this thing called latency, which DDR3 has less of than GDDR5, making it far better for doing large numbers of tasks at once. DDR3 is better than GDDR5 for pretty much anything that isn't graphics, which is what GDDR5 was created for.

In absolute terms it'll only be about 10% or so worse than DDR3 in the worst case, but we have these things called caches, which will mask the majority of that, especially considering the 3x additional bandwidth.

Modern CPUs and GPUs are all very latency tolerant.
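The cache-masking point can be shown with a simple expected-latency calculation (all numbers below are made up for illustration, not measured figures for either console):

```python
# Expected memory latency with a cache in front: even a modest hit rate
# shrinks a ~10% DRAM latency difference to a few percent end-to-end.
def avg_latency_ns(hit_rate, cache_ns, dram_ns):
    # Weighted average: hits served by cache, misses go to DRAM.
    return hit_rate * cache_ns + (1.0 - hit_rate) * dram_ns

cache_ns = 5.0    # hypothetical cache hit latency
ddr3_ns  = 50.0   # hypothetical DDR3 access latency
gddr5_ns = 55.0   # assume ~10% worse, per the post above

for hr in (0.90, 0.95):
    d = avg_latency_ns(hr, cache_ns, ddr3_ns)
    g = avg_latency_ns(hr, cache_ns, gddr5_ns)
    print(f"hit rate {hr:.0%}: DDR3 {d:.2f} ns vs GDDR5 {g:.2f} ns "
          f"({g / d - 1:.1%} slower)")
```

At a 90% hit rate the 10% raw difference shrinks to roughly 5% of average access time, and it keeps shrinking as the hit rate rises.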

Was ESRAM a reactionary patch job in response to Cerny's GDDR5?

No, it was in order to save money by going with cheaper memory. DDR3 was crazy cheap when they made this decision, but then the SK Hynix fire happened and DDR3 prices went way up, while GDDR5 prices dropped due to increased volume in other sectors.

Everyone knew 68 GB/s shared with the CPU isn't enough to feed a 7770.
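A rough sketch of that bandwidth argument (the CPU's share is a guessed figure for illustration; the 72 GB/s is the desktop HD 7770's dedicated GDDR5 bus, and the ESRAM number is the launch-era cited figure):

```python
# Back-of-the-envelope: what is left of the XB1's 68 GB/s DDR3 for the GPU
# once the CPU takes its share, versus a desktop HD 7770's dedicated bus.
ddr3_total_gbps = 68.0   # XB1 main memory, shared between CPU and GPU
cpu_share_gbps  = 20.0   # hypothetical CPU consumption under load
hd7770_gbps     = 72.0   # desktop HD 7770 with dedicated GDDR5
esram_gbps      = 102.0  # XB1 ESRAM, launch-era cited figure

gpu_from_ddr3 = ddr3_total_gbps - cpu_share_gbps
print(f"GPU share of DDR3: {gpu_from_ddr3} GB/s vs 7770's {hd7770_gbps} GB/s")
print(f"DDR3 share plus ESRAM ceiling: {gpu_from_ddr3 + esram_gbps} GB/s")
```

Which is the whole point of the ESRAM: the shared DDR3 alone leaves the GPU short of what the same GPU gets on desktop, and the small fast pool is there to make up the difference for bandwidth-heavy targets.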
 
It doesn't seem like ESRAM/DDR3 latency is going to give much or any real world performance benefit. Same with upclocks, upscaling, offloading, or whatever is being used to claim "hardware parity" these days.

I expect to see 720p to 900p or 900p to 1080p on most multiplats this gen, plus whatever benefits devs can get out of PS4's unified memory and GPGPU customizations. Xbox One is still a capable console, just somewhat less powerful.
 
No, it's down to how the engine employs the effect as coded for each platform (in this case a bokeh effect, a high-quality example of which we have actually seen in Ryse, which employs not only a DOF effect but also per-object motion blur, both simultaneously, in this gameplay image:).

Though it's conspicuously missing in the 'Bone version of NFS, it can do bokeh DOF quite well.

Both of these machines are quite capable, and though the PS4 is indisputably more powerful, using a launch multi-platform and cross-gen engine as somehow indicative of each machine's capability or future potential is unwise.

But the problem is that Ryse is 900p. We still haven't seen a deferred engine running at 1080p on the X1. NFS on the X1 lacks the bokeh DOF, morphological AA, and HBAO effects that are all applied in the PC and PS4 versions.

I think it is not about GPU flops, because the X1 GPU is decent. It is about memory architecture and framebuffer sizes.
 
But the problem is that Ryse is 900p. We still haven't seen a deferred engine running at 1080p on the X1. NFS on the X1 lacks the bokeh DOF, morphological AA, and HBAO effects that are all applied in the PC and PS4 versions.

I think it is not about GPU flops, because the X1 GPU is decent. It is about memory architecture and framebuffer sizes.

Of course it's about GPU power. You just listed a bunch of compute-heavy post process effects. The 40% power advantage for the PS4 is more than enough to account for their omission, considering the rest of the game looks similar across both consoles.
 
But the problem is that Ryse is 900p. We still haven't seen a deferred engine running at 1080p on the X1. NFS on the X1 lacks the bokeh DOF, morphological AA, and HBAO effects that are all applied in the PC and PS4 versions.

I think it is not about GPU flops, because the X1 GPU is decent. It is about memory architecture and framebuffer sizes.

NFS is still deferred, even without those effects.
 
Both of these machines are quite capable, and though the PS4 is indisputably more powerful, using a launch multi-platform and cross-gen engine as somehow indicative of each machine's capability or future potential is unwise.

What should we use?
 
But the problem is that Ryse is 900p. We still haven't seen a deferred engine running at 1080p on the X1. NFS on the X1 lacks the bokeh DOF, morphological AA, and HBAO effects that are all applied in the PC and PS4 versions.

I think it is not about GPU flops, because the X1 GPU is decent. It is about memory architecture and framebuffer sizes.
How did you even get that information? In fact, when DF analysed the visuals on both platforms, HBAO was missing in the PS4 version and was quickly added through a patch. The games look very similar and have the same AA, with both running at 1080p.

Ryse is a very, very clean game; I could easily mistake it for 1080p because the AA is so good. On a big TV there's not one aliased edge to be seen, at all. I imagine they dropped it to 900p to keep it clean and improve the shaders. Considering it's a launch game, they did an exceptional job.

Regarding the OP, MS have really had some bad luck when it comes to the RAM situation: DDR3 prices climbing exceptionally high due to a factory fire, and GDDR5 having the supply to support 8GB in the PS4. They probably would have struggled if both consoles had gone GDDR.

If GDDR5 supply hadn't developed the way it did, there would have been no chance the PS4 could have had 8GB of it, and the generation would have been a very different one.

ESRAM does an exceptional job at holding Z-buffers and doing post-processing, with the only problem being that it's 32MB. People forget that there have been a lot of advancements in texture streaming recently, like tiling, which will only get used later in the generation. The same goes for technology like tessellation, which puts a heavy burden on the CPU.

I think the engineers will be kicking themselves when it comes to certain things, but MS are definitely not in a bad place. The four DMA engines and SHAPE really do provide noticeable benefits to the box.
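The 32MB constraint mentioned above is easy to see with some framebuffer arithmetic. A sketch assuming a typical deferred setup of four 32-bit colour targets plus a 32-bit depth/stencil buffer (real engines vary in layout and format):

```python
# Does a typical deferred G-buffer fit in 32 MB of ESRAM?
ESRAM_MB = 32

def gbuffer_mb(width, height, render_targets=4, bytes_per_pixel=4):
    # render_targets colour buffers + 1 depth/stencil, all 4 bytes/pixel
    total_bytes = width * height * bytes_per_pixel * (render_targets + 1)
    return total_bytes / (1024 * 1024)

for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    mb = gbuffer_mb(w, h)
    fits = "fits" if mb <= ESRAM_MB else "does NOT fit"
    print(f"{w}x{h}: {mb:.1f} MB -> {fits} in {ESRAM_MB} MB ESRAM")
```

Under these assumptions a full 1080p G-buffer (~39.6MB) overshoots the 32MB of ESRAM, while 900p (~27.5MB) fits, which lines up with the render-target-size argument made elsewhere in the thread. (Tiling the targets or spilling some to DDR3 are the obvious workarounds, at an optimization cost.)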
 
Uh oh! A thread where someone in the industry has something positive to say about the XB1. 10 pages incoming of experts telling them that they are wrong.

Anyone with an ounce of common sense has worked out that DDR3 with ESRAM is inferior to the PS4's solution of unified GDDR5 RAM and a much beefier GPU. But you can believe otherwise if it makes you feel better.
 
But the problem is that Ryse is 900p. We still haven't seen a deferred engine running at 1080p on the X1. NFS on the X1 lacks the bokeh DOF, morphological AA, and HBAO effects that are all applied in the PC and PS4 versions.

I think it is not about GPU flops, because the X1 GPU is decent. It is about memory architecture and framebuffer sizes.

Those are things devs can add to the PS4 versions of multiplat games, and it will take very little time.

The XB1 memory setup is shite, but it doesn't really contribute to those particular effects; they're pure GPU grunt.
 
Anyone with an ounce of common sense has worked out that DDR3 with ESRAM is inferior to the PS4's solution of unified GDDR5 RAM and a much beefier GPU. But you can believe otherwise if it makes you feel better.

The quote in the OP never claimed it was better. This post is about how an actual developer says the ESRAM helps smooth out the optimisation process on the XONE.

Way to fan the console war flames. Go you! ...if it makes you feel better.
 
I imagine they dropped it to 900p to keep it clean and improve the shaders.
You wish. The only reason for going down in resolution was limitation. The game struggles to hold a constant 30fps at 900p. With 44% more pixels to render at native 1080p, the framerate would have been in single-digit territory.
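As a check on the pixel arithmetic:

```python
# Pixel-count arithmetic for the 900p vs 1080p fill-rate claim.
pixels_900p  = 1600 * 900    # 1,440,000
pixels_1080p = 1920 * 1080   # 2,073,600
increase = pixels_1080p / pixels_900p - 1.0
savings  = 1.0 - pixels_900p / pixels_1080p
print(f"1080p has {increase:.0%} more pixels than 900p")      # 44%
print(f"900p renders {savings:.0%} fewer pixels than 1080p")  # 31%
```

So the jump from 900p to 1080p is a 44% increase in pixels shaded per frame, though whether the framerate would scale down that steeply depends on how much of the frame time is actually pixel-bound.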
 
You wish. The only reason for going down in resolution was limitation. The game struggles to hold a constant 30fps at 900p. With 44% more pixels to render at native 1080p, the framerate would have been in single-digit territory.
They had the game running at 720p, 900p and 1080p, and they picked 900p. I imagine at 1080p they lost the clean IQ with the removal of certain effects.

Have you even played it? Ryse is the most visually clean and stunning game I've played on either platform. If you'd played it, you wouldn't even comment on the frame rate, because it isn't even noticeable.
 
They had the game running at 720p, 900p and 1080p, and they picked 900p. I imagine at 1080p they lost the clean IQ with the removal of certain effects.

Have you even played it? Ryse is the most visually clean and stunning game I've played on either platform. If you'd played it, you wouldn't even comment on the frame rate, because it isn't even noticeable.

You can't compare a "small", linear 900p game with an unstable fps like Ryse against a much bigger 1080p game, also with an unstable fps but with far more going on on-screen, like KZ: SF.

[images: tCHZpBp.jpg, ShadowFall4.jpg, ShadowFall1.jpg, 20131204172247.jpg, Ba2s2N1CYAE00bX.jpg, 20131208044809.jpg — Killzone: Shadow Fall screenshots]
 
They had the game running at 720p, 900p and 1080p, and they picked 900p. I imagine at 1080p they lost the clean IQ with the removal of certain effects.

Have you even played it? Ryse is the most visually clean and stunning game I've played on either platform. If you'd played it, you wouldn't even comment on the frame rate, because it isn't even noticeable.

I imagine at 1080p they'd have hit an average framerate of 19fps.
 
How did you even get that information? In fact, when DF analysed the visuals on both platforms, HBAO was missing in the PS4 version and was quickly added through a patch. The games look very similar and have the same AA, with both running at 1080p.

Digital Foundry completely missed it. But the PS4 and PC have a refined morphological AA, versus a raw blur on aliased edges on the X1. In any case, the AA is only applied to the upper part of the image in all three versions.

[image: NFS_Lamp_post.png]


No AA on the bottom part of the image in any of the three versions.

[image: NFS_4_PS4_aliased_detail.png]
 
Weird hearing somebody say something positive about this.

However, I watched the Dying Light videos at the VGX. When the Techland guy was standing with Geoff/Joel talking as the video played, they outright said it was 1080p, and watching it, it was very smooth. Which surprised me, because the Dead Island titles had consistent frame rate drops.

Then, when they sat down to talk and started the next gameplay video, the frame rate was horrible... the graphics looked a little worse too. I was like, "Damn it, Techland are bullshitting... or maybe this gameplay is the PS3 version." After watching, I noticed Xbox buttons popping up on screen. Take from this what you will. Maybe this was the 360 version.

Anyone else notice this?
It was the PS4 version. The PC version was played by PewDiePie later, hence the Xbox prompts.
 
They had the game running at 720p, 900p and 1080p, and they picked 900p. I imagine at 1080p they lost the clean IQ with the removal of certain effects.

Have you even played it? Ryse is the most visually clean and stunning game I've played on either platform. If you'd played it, you wouldn't even comment on the frame rate, because it isn't even noticeable.
I played it, and it has some great graphics, but the levels are rather small and it's basically the same type of enemy all the time. It has great tech, but only because it hides many of its flaws, like those low-res assets. The downside is the sub-30 framerate, and all those fancy graphics are pretty worthless if it runs like that. They couldn't do it at 1080p with the same fidelity; 900p was the best they could do on this hardware, and as I said, while it looks good, it runs rather unsatisfyingly.

Again: they couldn't run the game with this fidelity at 1080p on the Xbone and were forced to 900p; nothing else was the reason.
 