
CBOAT: ESRAM handicap for now, but will get better

Wait, they're still releasing native 1080p media? :/

So it's a PC build then presumably?

Or alternatively cutscenes are pre-rendered at higher IQ using in-game assets?

Well, I believe the poster who wrote it. I also looked at the same lines he did and came to the same conclusion: it's native 1080p. If Microsoft have said specifically that this is from an Xbox One, then it won't be a PC build; that would be illegal to claim, and it would be very silly to get into trouble over something so minor. I personally think native 1080p cutscenes will be on the XB1 for Ryse.
 
I wouldn't judge a game's quality based on some pre-rendered trailer. Go try to find an XB1 kiosk and play the game for yourself. You need to see the game running on actual XB1 hardware.

I'm going to see it tonight; unfortunately it's the version before the upclock. Regardless, I still don't believe it's prerendered.
 
Would MS have even been better off with discrete CPU and GPU, and split memory pools, say 3GB main RAM and 5GB GDDR5?
 
Not nearly as much, probably. It's a conservative design based on a very common and well known architecture, exceptionally well documented and with solid and robust tools available. The more unconventional a platform is, the bigger the room for growth. It's by far the least challenging system out of the three.
Xbox One isn't unconventional, it's just poorly designed.

There's nothing unconventional about the Xbone especially for console developers. PS2 and PS3 - now those were unconventional.
 
Well, I believe the poster who wrote it. I also looked at the same lines he did and came to the same conclusion: it's native 1080p. If Microsoft have said specifically that this is from an Xbox One, then it won't be a PC build; that would be illegal to claim, and it would be very silly to get into trouble over something so minor. I personally think native 1080p cutscenes will be on the XB1 for Ryse.

Could be pre-recorded video cutscenes like Uncharted used in the past.
 
I'd prefer to take the word of people in the know rather than someone on a forum doing some maths.

tumblr_lkwpv2eDdF1qa2btu.gif
 
They have very little actual performance to show for their transistor budget, so it's poorly designed.

Given that they had to design an architecture around DDR3, I think that it is a very good design. Microsoft's engineers are not stupid. If anything deserves criticism, it's the management that made the requirements (Win8/media multitasking, Kinect, ...) and apparently underestimated the schedule.
 
Could be pre-recorded video cutscenes like Uncharted used in the past.

When I said this before I was told to stop being a PS4 fanboy. But if you look at the latest trailer there's a big difference between the in-game stuff and the cutscene stuff.
 
.... are you even trying to understand?
They have very little actual performance to show for their transistor budget, so it's poorly designed.

But I'll humor your attempt at deflecting the logic:
The cost is decided by the size of the die.
A 6.1-billion-transistor die on the same process node comes from the same wafers that would be used for the 5-billion-transistor Xbone APU die.
Yields would be somewhat lower, as a bigger die means more chance of a fault on each die cut from the wafer.

A 20 percent bigger die for the GPU would cost maybe 40 percent more, depending on yields (as the die gets bigger, yields drop exponentially, which is why the giant 5-billion-transistor die with embedded ESRAM is a shit idea for a low-end APU like the one in the Xbox One).

The small 1.2-billion-transistor chip for the CPU would have much better yields, and cost 20 percent or less of the big Xbone chip...
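If you want to sanity-check that yield claim, here's a back-of-the-envelope sketch in Python using the classic Poisson defect-density model (yield = exp(-D * A)). The defect density, wafer cost, and die areas are made-up illustrative numbers, not actual foundry or AMD figures:

import math

# Poisson defect-density yield model: yield = exp(-D * A).
# D, the wafer cost, and the die areas are assumptions for illustration.
D = 0.4                                    # defects per cm^2 (assumed)
dies_cm2 = {
    "big APU (~5B transistors)":     3.6,  # assumed area, ESRAM included
    "discrete GPU (hypothetical)":   2.4,
    "small CPU (~1.2B transistors)": 0.8,
}

def cost_per_good_die(area_cm2, wafer_cost=5000.0, wafer_area_cm2=706.9):
    # Spread the wafer cost over the good dies only: candidates per wafer
    # fall linearly with area, but yield falls exponentially with area.
    candidates = wafer_area_cm2 / area_cm2      # ignores edge losses
    good_fraction = math.exp(-D * area_cm2)
    return wafer_cost / (candidates * good_fraction)

for name, area in dies_cm2.items():
    print(f"{name}: yield {math.exp(-D * area):.0%}, "
          f"cost per good die ${cost_per_good_die(area):.0f}")

With these toy numbers the one big APU comes out at roughly twice the cost of the split CPU + GPU pair combined; the real figures differ, but the exponential shape of the curve is the point.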

Add some small costs for the more complex PCB to connect the two...

They have this huge APU that has very little performance and costs a lot to make...
Again: poorly designed.

I'm sure this isn't what MS or AMD engineers wanted, but if the suits told them to make it work without VRAM, then this is what they had to do...

I'm sure it's a lot more complicated than saying this module is x better at only x% price increase. Besides, 40% is still a pretty huge cost increase.
 
Why were you talking about TV scalers then? The Xbox outputs at 1080p; the Xbox does the scaling. My point remains: only very few people can tell Ryse is not 1080p. The game looks jaw-droppingly good and has very high IQ. Simple.
Because no scaling tech has magic that produces non-existent information, not even that mega hyper scaler inside the Xbone. What's not there can't be forged. This upscaling shit has to end, because all it does is interpolate, and the result is always the same: blur. Your very high IQ would hold on a native 900p display, because that is the rendered resolution; the 1080p output is upscaled, with roughly 31% of it interpolated from nothing, so there's no 1:1 pixel mapping on a 1080p device (only a 900p image stretched to fit 1080p). Image quality suffers from that process. Not so simple anymore. Some current-gen games have great IQ too, like Gears, God of War 3, or The Last of Us... at least on an HD Ready set; on a full HD display the IQ is barely "OK".

And I see that Ryse promotional material is straight from PC... oh my.
 
Because no scaling tech has magic that produces non-existent information. What's not there can't be forged. This upscaling shit has to end, because all it does is interpolate, and the result is always the same: blur. Your very high IQ would hold on a native 900p display, because that is the rendered resolution; the 1080p output is upscaled, with roughly 31% of it interpolated from nothing, so there's no 1:1 pixel mapping on a 1080p device (only a 900p image stretched to fit 1080p). Image quality suffers from that process. Not so simple anymore. Some current-gen games have great IQ too, like Gears, God of War 3, or The Last of Us... at least on an HD Ready set; on a full HD display the IQ is barely "OK".

And I see that Ryse promotional material is straight from PC... oh my.

You're on fire tonight.
 
Given that they had to design an architecture around DDR3, I think that it is a very good design. Microsoft's engineers are not stupid. If anything deserves criticism, it's the management that made the requirements (Win8/media multitasking, Kinect, ...) and apparently underestimated the schedule.

I didn't blame the engineers, I blamed management.
The poor design here is going for DDR3, which forces them to use ESRAM, which is embedded on the die and takes up an unholy shitton of die space.
As a die gets bigger, the cost to manufacture a working sample goes up not linearly but exponentially.
They can't even bin their faulty samples as a lower-end part like AMD or Nvidia normally would in PC land.

Go for an APU (save some money on PCB cost, gain some performance from how memory is handled between CPU and GPU) and opt for DDR3 instead of more expensive GDDR5 unified memory, or simply the traditional split pool for CPU and GPU, to save some money.
End up paying more, because the costly ESRAM and the CPU and GPU sharing one die mean you hit diminishing returns on die space (thanks to how yields scale), so there's no room left for a half-decent GPU.
That is what happens when the shots are called by beancounters.
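For context, the bus arithmetic that forces the ESRAM into the design in the first place (a quick Python check; the memory specs are the widely reported launch figures, so treat them as approximations):

# Peak bandwidth = effective transfer rate (MT/s) * bus width (bytes).
def peak_gb_s(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits / 8) / 1000

print(f"XB1 DDR3-2133, 256-bit bus:  {peak_gb_s(2133, 256):.0f} GB/s")  # ~68
print(f"PS4 GDDR5-5500, 256-bit bus: {peak_gb_s(5500, 256):.0f} GB/s")  # ~176

Roughly 68 GB/s of DDR3 on its own would starve any modern GPU, so once you commit to DDR3 you're more or less committed to spending a huge chunk of die area on embedded memory to compensate.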
 
Could be pre-recorded video cutscenes like Uncharted used in the past.

Seems very likely, but then, for comparison's sake, it would be better suited to Dark Sorcerer comparisons than Killzone comparisons, which does not speak well for the XB1.

Given that they had to design an architecture around DDR3, I think that it is a very good design. Microsoft's engineers are not stupid. If anything deserves criticism, it's the management that made the requirements (Win8/media multitasking, Kinect, ...) and apparently underestimated the schedule.

I couldn't agree more. The XB1, from a hardware point of view, is as good as they could have gotten considering these goals from management:

8GB RAM - in 2010 this meant DDR3, no other option.
Kinect packed in for everyone - large costs for the Kinect, as well as the additional hardware in the XB1 such as the sound block.
Quiet operation for watching TV.
Sold at a profit at $499.

There's not really much wiggle room there. I also think Sony's engineers did great, but their priorities were different:

Sold at $399 with only a reasonable loss.
No bundled camera - more room for higher cost.
Powerful and simple.

I think the hardware engineers at both companies did great.

Also, I believe Microsoft were unfortunate regarding EDRAM. I think there were some issues with x86 licensing from AMD (involving Intel and where the chip could be produced), and the foundries available did not allow for EDRAM. But that's only a rumor/supposition.
 
I think it's the whole package. It's all balanced, as they say. There isn't a single piece of it that's designed for 1080p.

Which I find odd, as I had a similarly powerful GPU before and it coped with 1080p years ago on my PC, though it used GDDR, not this DDR/ESRAM solution.

The GPU is 1080p-capable; the memory setup seemingly isn't (not easily, anyway, as Forza shows it can be done).
 
Because no scaling tech has magic that produces non-existent information, not even that mega hyper scaler inside the Xbone. What's not there can't be forged. This upscaling shit has to end, because all it does is interpolate, and the result is always the same: blur. Your very high IQ would hold on a native 900p display, because that is the rendered resolution; the 1080p output is upscaled, with roughly 31% of it interpolated from nothing, so there's no 1:1 pixel mapping on a 1080p device (only a 900p image stretched to fit 1080p). Image quality suffers from that process. Not so simple anymore. Some current-gen games have great IQ too, like Gears, God of War 3, or The Last of Us... at least on an HD Ready set; on a full HD display the IQ is barely "OK".

And I see that Ryse promotional material is straight from PC... oh my.

Did I miss something?
 
Because no scaling tech has magic that produces non-existent information. What's not there can't be forged. This upscaling shit has to end, because all it does is interpolate, and the result is always the same: blur. Your very high IQ would hold on a native 900p display, because that is the rendered resolution; the 1080p output is upscaled, with roughly 31% of it interpolated from nothing, so there's no 1:1 pixel mapping on a 1080p device (only a 900p image stretched to fit 1080p). Image quality suffers from that process. Not so simple anymore. Some current-gen games have great IQ too, like Gears, God of War 3, or The Last of Us... at least on an HD Ready set; on a full HD display the IQ is barely "OK".

And I see that Ryse promotional material is straight from PC... oh my.

Why do you keep saying this as if I said the opposite? I know exactly how upscaling works and have never suggested anything else. I was just challenging your nonsensical statement about needing a "quad core tv scaler" to not notice the difference.

The IQ in the Ryse trailer is very high; that's what I was referring to. If that footage was rendered, upscaled, and output all on the Xbox One, then we have nothing to worry about. If it turns out those scenes were prerendered on PC and then merely output from the Xbox One, then we have an issue.
 
I'm sure it's a lot more complicated than saying this module is x better at only x% price increase. Besides, 40% is still a pretty huge cost increase.

... again, read, please. I don't have many more ways to keep rephrasing the same thing:

The chip that is 5x more powerful doesn't cost anywhere near 5x as much to make.
Performance per dollar of production cost for the Xbox One APU is very, very poor.
It should be way more cost effective.

The underlined is what I've been rephrasing three times for you now; I give up.

The IQ in the Ryse trailer is very high; that's what I was referring to. If that footage was rendered, upscaled, and output all on the Xbox One, then we have nothing to worry about. If it turns out those scenes were prerendered on PC and then merely output from the Xbox One, then we have an issue.
It isn't.
That's the point...
 
Did I miss something?

Just an inference from the fact that it looks remarkably clean and the framerate in cutscenes is perfect, despite looking better than the gameplay, which has dips/tears.

Crytek deny it, but then, they also said that they'd only ever released 900p media even though the early media was undeniably native 1080p.
 
Which I find odd, as I had a similarly powerful GPU before and it coped with 1080p years ago on my PC, though it used GDDR, not this DDR/ESRAM solution.

The GPU is 1080p-capable; the memory setup seemingly isn't (not easily, anyway, as Forza shows it can be done).

I'm sure the Xbone could run many games from years ago at 1080p as well. A similar GPU (Radeon 7770) sits in my system, and it certainly won't run Battlefield 4 at 1080p. It's not a sensible comparison.
 
The memory architecture is unconventional, as is its use of lots of dedicated silicon.

Are you saying there is going to be a huge learning curve for devs on Xbox One, as there was for Cell and Emotion Engine? I don't see that at all. Improved tools will certainly help, but the memory setup is hardly foreign to developers.
 
You're on fire tonight.

Only when spreading FUD. Upscaling is no magic, and it's not something good either. It's necessary to fit an image onto a display with a higher resolution, nothing more. The tech has reached its limit and the result is still the same: worse image quality, because the picture has to be "stretched". There is nothing good about it, and it shouldn't be necessary for games in 2013 that target 1080p...
The IQ in the Ryse trailer is very high; that's what I was referring to. If that footage was rendered, upscaled, and output all on the Xbox One, then we have nothing to worry about. If it turns out those scenes were prerendered on PC and then merely output from the Xbox One, then we have an issue.

Fair enough. But you act like the Xbone scaler is some wizardry shit that could make up a 31% resolution difference without any loss in quality.
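To make the "interpolation can't create information" point concrete, here's a minimal bilinear upscale in Python/NumPy. Every output pixel is a weighted average of at most four existing input pixels, which is why upscaled 900p can never recover real 1080p detail. This is a toy sketch, not what the Xbone's scaler actually implements:

import numpy as np

def bilinear_upscale(img, out_h, out_w):
    # Map each output coordinate back onto the input grid...
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    # ...then blend the four surrounding samples. No new detail appears;
    # high-frequency information is averaged away, which reads as blur.
    return ((1 - wy) * (1 - wx) * img[np.ix_(y0, x0)]
            + (1 - wy) * wx * img[np.ix_(y0, x1)]
            + wy * (1 - wx) * img[np.ix_(y1, x0)]
            + wy * wx * img[np.ix_(y1, x1)])

frame_900p = np.random.rand(900, 1600)                   # native render
frame_1080p = bilinear_upscale(frame_900p, 1080, 1920)   # scaler output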
 
PS3 had that split RAM problem and lack of a scaler.

An SDK update after release (January) offered an API to access the hardware horizontal scaler, which is what games like GT5 and Wipeout HD use. So it did have a horizontal-only hardware scaler.
 
I'm sure the Xbone could run many games from years ago at 1080p as well. A similar GPU (Radeon 7770) sits in my system, and it certainly won't run Battlefield 4 at 1080p. It's not a sensible comparison.

I don't know; that same card would have run it at 1080p, but at lower settings, of course.

It's certainly a shame they didn't go for something closer to a capable current GPU, it must be said. A 760 isn't that expensive, for instance, though I guess it'd bump the price of these boxes even further.
 
I haven't really been swept into the controversy surrounding the Xbone and its CoD: Ghosts/TitanFall-are-rendered-at-720p negative buzz, because I figured hey, these are launch games, we'll see future titles upped to 1080p as developers become more attuned with the hardware.

But you're telling me that 720-900p will be the standard for Xbone games throughout the entire gen?! Woooooooow. That's not good at all.
 
So, this thread taught me a lot of things.

1.) Although consoles are a technical medium, tech doesn't matter. Games do, however, and the perceived quality of a game is not at all related to its technical presentation. At all.

2.) Resolution doesn't matter, because the difference between native and non-native resolutions is only visible to 0.0001% of "people".

3.) "People" will buy anything anytime, so quality standards are not necessary, because even for us enthusiasts everything is fine as long as "people" buy. And they will.

4.) 720p in 2018 will still look great.

5.) We don't even need to start thinking about 60 FPS as a standard for the coming years. It doesn't matter anyway; see 1.)

6.) Dead Rising 3, Killer Instinct 720p, Forza 4 HD and Lair would receive just as much hype and would sell just as well if they were on the 360, because it's the games that count. "People" would be fine staying with the 360 anyway, because they don't notice technical differences.

7.) If "people" are happy, (XBOX-)GAF is happy.

Wow, thanks. That's great.

We don't need to start thinking about 60fps as a standard because it's unnecessary and brings drawbacks in other areas. Until we have true dynamic photorealism across all types of scenes, there will always be a tradeoff.
 
I haven't really been swept into the controversy surrounding the Xbone and its CoD: Ghosts/TitanFall-are-rendered-at-720p negative buzz, because I figured hey, these are launch games, we'll see future titles upped to 1080p as developers become more attuned with the hardware.

But you're telling me that 720-900p will be the standard for Xbone games throughout the entire gen?! Woooooooow. That's not good at all.

It's kind of silly on MS's behalf, really. It's got headroom at 720p but not enough power for 1080p. 900p would be fine if displays could scale to any resolution natively, but they can't.
 
Only when spreading FUD. Upscaling is no magic, and it's not something good either. It's necessary to fit an image onto a display with a higher resolution, nothing more. The tech has reached its limit and the result is still the same: worse image quality, because the picture has to be "stretched". There is nothing good about it, and it shouldn't be necessary for games in 2013 that target 1080p...

Fair enough. But you act like the Xbone scaler is some wizardry shit that could make up a 31% resolution difference without any loss in quality.

I never said anything like that; you just went off on one for some reason. I didn't even mention scalers, you did. All I said is that most people wouldn't be able to tell Ryse was upscaled from 900p if the graphics look as good as they do in the trailer.
 
Wow... back to the Durango/Orbis days. Who would have expected that the X1 would struggle to output games at 1080p?

The scary thing is time's ability to lessen the shock! I imagine if MS had released the X1 with Dreamcast specs inside, it would be OK by now.

The good thing is that the PS4's 50% power advantage has started to show this early... I mean, 900p vs 1080p is already a 44% pixel difference, and 720p vs 1080p is 125%... not factoring in other differences in textures and framerates.
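The raw pixel counts behind those percentages, for anyone who wants to check the arithmetic:

# Pixel counts for the resolutions argued about in this thread.
res = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
px = {name: w * h for name, (w, h) in res.items()}

for low in ("720p", "900p"):
    more = px["1080p"] / px[low] - 1        # extra pixels in native 1080p
    interp = 1 - px[low] / px["1080p"]      # interpolated share after upscale
    print(f"1080p has {more:.0%} more pixels than {low}; "
          f"upscaling {low} leaves {interp:.0%} of the frame interpolated")

# Output: 44% more / 31% interpolated for 900p; 125% more / 56% for 720p.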

Next gen will be exciting.
 
Every time a game is revealed on the Xbone, the first question on everyone's lips will be: at what resolution? It doesn't matter if it's noticeable or not; it's another negative connotation associated with the brand...
 
... again, read, please. I don't have many more ways to keep rephrasing the same thing:

The chip that is 5x more powerful doesn't cost anywhere near 5x as much to make.
Performance per dollar of production cost for the Xbox One APU is very, very poor.
It should be way more cost effective.

The underlined is what I've been rephrasing three times for you now; I give up.

It's my fault, I only skimmed through your posts; I apologise for that. I still think your comparison is a bit naive, as you'd need a more direct comparison for such a specific piece of hardware. Not denying that you may have a point, though.
 
The good thing is that the PS4's 50% power advantage has started to show this early... I mean, 900p vs 1080p is already a 44% pixel difference, and 720p vs 1080p is 125%... not factoring in other differences in textures and framerates.

Precisely why Xbone's design is so silly. 720p on Xbone vs. 1080p on PS4 is massively underutilising the Xbone, but I suspect it'll be commonplace because of eSRAM.

If MS were sensible they'd set up first party studios and an equivalent of the ICE team (perhaps at 343i), because it would at least minimise the gap for first party games. But looking at stuff like Forza and Fable for Xbone, they're apparently happy for their teams to carry on half-arsing things.
 
Are you saying there is going to be a huge learning curve for devs on Xbox One, as there was for Cell and Emotion Engine? I don't see that at all. Improved tools will certainly help, but the memory setup is hardly foreign to developers.
I didn't compare it to PS2 or PS3, I compared it to PS4. And there's a bigger learning curve compared to that system.

Exploiting the embedded memory correctly is not trivial. Just look at the Wii U, which uses a similar memory architecture. Even competent developers like Shin'en only realized how to use the embedded RAM, and what it could bring to the table, after shipping their first game, and it required fundamental changes to their engine.
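A concrete illustration of why it's non-trivial on Xbox One specifically: a typical deferred renderer's targets stop fitting in the 32MB of ESRAM somewhere between 900p and 1080p. The target list below is a hypothetical setup chosen for illustration, not any particular engine's:

# Render-target footprint vs the Xbox One's 32MB of ESRAM.
def target_mb(w, h, bytes_per_px):
    return w * h * bytes_per_px / (1024 * 1024)

for name, (w, h) in [("900p", (1600, 900)), ("1080p", (1920, 1080))]:
    color = target_mb(w, h, 4)          # RGBA8 back buffer
    depth = target_mb(w, h, 4)          # D24S8 depth/stencil
    gbuffer = 3 * target_mb(w, h, 4)    # three RGBA8 G-buffer targets
    total = color + depth + gbuffer
    fits = "fits" if total <= 32 else "does NOT fit"
    print(f"{name}: {total:.1f} MB of targets, {fits} in 32 MB ESRAM")

At 900p this hypothetical set comes to about 27MB and squeezes in; at 1080p it's about 40MB, so something has to spill out to slow DDR3, be tiled, or be cut, which is exactly the kind of engine rework being described above.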
 
It's my fault, I only skimmed through your posts; I apologise for that. I still think your comparison is a bit naive, as you'd need a more direct comparison for such a specific piece of hardware. Not denying that you may have a point, though.

If you want a direct comparison, compare it to the PS4:
same architecture, also uses an APU, chose a different memory configuration, has more of its transistor budget dedicated to performance, and costs less to make despite faster, better, more expensive memory.

I didn't want to compare it to the PS4 because that gets interpreted as console wars, and then people stick their fingers in their ears.
 
Would MS have even been better off with discrete CPU and GPU, and split memory pools, say 3GB main RAM and 5GB GDDR5?

I think they just went for an APU to try and get some longer term cost reductions (although with process shrinks slowing down, I'm not sure how much of a saving they expect over 5 years). I think both MS and Sony might have been surprised how expensive the current consoles still are to produce.

They could have got better performance for similar silicon budget at launch with a discrete GPU or they could have gone with EDRAM on a daughter die like the 360, but those again don't scale down as easily as an APU.

Shortsighted? Easy to say in hindsight, I guess.
 
ps4-dual-architecture.jpg


OP reminded me of that. Hopefully devs will get a handle on the new Xbox soon. It took a long while with the PS3 and in a lot of cases 3rd party support never fully recovered. At least the X1 doesn't seem as confusing as the PS3 was to program for.
 
I think they just went for an APU to try and get some longer term cost reductions (although with process shrinks slowing down, I'm not sure how much of a saving they expect over 5 years). I think both MS and Sony might have been surprised how expensive the current consoles still are to produce.

They could have got better performance for similar silicon budget at launch with a discrete GPU or they could have gone with EDRAM on a daughter die like the 360, but those again don't scale down as easily as an APU.

Shortsighted? Easy to say in hindsight, I guess.

All of this

We'll see how much the difference in PCB/memory-bus cost actually matters during the lifecycle.
Right now they're both getting the short end of the stick with the choice to go for an APU.
 
ps4-dual-architecture.jpg


OP reminded me of that. Hopefully devs will get a handle on the new Xbox soon. It took a long while with the PS3 and in a lot of cases 3rd party support never fully recovered. At least the X1 doesn't seem as confusing as the PS3 was to program for.
The thing on the right isn't an XBO, it's a considered, and discarded, PS4 architecture.
 
Precisely why Xbone's design is so silly. 720p on Xbone vs. 1080p on PS4 is massively underutilising the Xbone, but I suspect it'll be commonplace because of eSRAM.

If MS were sensible they'd set up first party studios and an equivalent of the ICE team (perhaps at 343i), because it would at least minimise the gap for first party games. But looking at stuff like Forza and Fable for Xbone, they're apparently happy for their teams to carry on half-arsing things.

If the Xbone engineers were real gamers, they would have immediately refused to design the X1 as it is now. MS should've paired GDDR5 with the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared more about shoving in media apps and advertising than anything else.
 
If the Xbone engineers were real gamers, they would have immediately refused to design the X1 as it is now. MS should've paired GDDR5 with the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared more about shoving in media apps and advertising than anything else.

What a silly thing to say. Engineers have to design within constraints, they're not free to do whatever they want.
 
The thing on the right isn't an XBO, it's a considered, and discarded, PS4 architecture.

I know; it's just that the OP reminded me of that part of the Cerny presentation. I remember, while watching it, wishing he had chosen the option on the right. I kind of feel otherwise now.
 
If the Xbone engineers were real gamers, they would have immediately refused to design the X1 as it is now. MS should've paired GDDR5 with the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared more about shoving in media apps and advertising than anything else.

mj-laughing.gif
 