
X1 DDR3 RAM vs PS4 GDDR5 RAM: “Both Are Sufficient for Realistic Lighting” (Geomerics)

avaya

Member
That's almost precisely what developers will be doing. At its base it helps with bandwidth, but with better, more effective use of the ESRAM, its latency will end up being cleverly utilized for maximum benefit to games being developed for the system.

How is the exposed nerve?
 

TheKayle

Banned
Bro, everyone knows you are the General in the Xbox defense army. Please, just stop. You are embarrassing yourself.

That memory system configuration is there to fix GPU stalls and the like; there's no doubt about that. We'll see how much fixing texture thrashing and stalls actually closes the gap that exists.
 
Haha, this is even getting ridiculed by guys from Sony Santa Monica


ICE guys are having fun with it too.

Dan Olson (@olson_dan)
7/13/13, 4:38 AM
Uh oh this sounds bad for PS4... ... neogaf.com/forum/showpost…
Adrian Bentley (@adrianb3000)
7/13/13, 11:05 AM
@olson_dan @TobiasBerghoff Lol. Luckily we've developed software to optimize out hardware boolean usage. Shhh don't tell anyone. ;)
Cort (@postgoodism)
7/13/13, 12:08 PM
@adrianb3000 @olson_dan @TobiasBerghoff Way to spoil our big TGS announcement! Good work, Bentley.
Tobias Berghoff (@TobiasBerghoff)
7/13/13, 12:17 PM
@postgoodism @adrianb3000 @olson_dan At least he didn't mention that TLG is now a text adventure with highly tessellated strings.
 

TheCloser

Banned
That memory system configuration is there to fix GPU stalls and the like; there's no doubt about that. We'll see how much fixing texture thrashing and stalls actually closes the gap that exists.

I'm not denying that, but at this point it's just grasping at straws. It's not going to suddenly make the Xbox One more powerful than it is; it's just a means to best use the power it has. When an Xbox One dev tells you that the PS4 is more powerful and the debate is only over how much, it's time to stop talking. If you look into his post history, he has been defending the Xbox One since day one. At this point it's just hilarious and embarrassing. I'm not saying you have no right to like the Xbox One, but like it because it's what you want, and be happy with it the way it is.
 
Explain the bolded excerpt, because it seems to me like you're just trying to glorify latency into something it's not.

I'm doing nothing of the sort. Actual game developers, even a Sony first party dev, have suggested that really low latency would make a real difference in the performance and efficiency of the Xbox 1 hardware. That's not fictional, it's fact.

http://beyond3d.com/showpost.php?p=1696970&postcount=245

No, the benefit of the EDRAM in the 360 was moving all of the GPU's output bandwidth to a separate pool of memory, with enough bandwidth for it to never be a bottleneck.
The SRAM performs a similar function, and potentially more.
If the pool is actually SRAM as opposed to some EDRAM variant, then it would have very low latency; this would mean that using it as a data source would greatly increase GPU performance in memory-limited cases.
If it really is SRAM that memory is a big block on the die, and 64MB was probably not practical.

http://beyond3d.com/showpost.php?p=1696796&postcount=144

--------------------------------------------------------------------------------

I should have said earlier: if it's really SRAM, and therefore very low latency, having something to block-copy data to it makes a lot of sense.

If you know the GPU will be reading from a buffer in a cache-unfriendly way, moving that buffer to the low-latency memory would dramatically improve the utilization of the CUs. In some cases more so than having significantly more CUs.

http://beyond3d.com/showpost.php?p=1697305&postcount=401

they might let software better exploit the 32MB scratch pad.
If that Scratchpad is low latency then it could make a large difference to the efficiency of the system.

IMO, and from what I've been told, most shaders spend more time waiting on memory than they do computing values; if that pool of memory is similar to L2 cache performance, you'd be looking at a cache miss dropping from 300+ GPU cycles to 10-20. Hiding 10-20 cycles is a lot easier than hiding 300.

IF the ESRAM pool is low latency then I think the Durango architecture is interesting.
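ERP's 300-vs-20-cycle point can be sanity-checked with a toy latency-hiding model. This is my own sketch, not anything from the thread: the miss latencies are the ones quoted above, and the compute-cycles-between-misses figure is an invented assumption.

```python
# Toy model: a GPU hides memory latency by switching between resident
# wavefronts, so the number of wavefronts needed to cover a miss is
# roughly latency / compute_cycles_between_misses.

def wavefronts_to_hide(miss_latency_cycles, compute_cycles_between_misses):
    """Wavefronts that must be resident so the ALUs never sit idle."""
    return -(-miss_latency_cycles // compute_cycles_between_misses)  # ceiling division

# Hypothetical: ~300 cycles to main RAM vs ~20 cycles to a low-latency pool,
# with an assumed 25 cycles of useful work between misses.
print(wavefronts_to_hide(300, 25))  # -> 12 wavefronts needed
print(wavefronts_to_hide(20, 25))   # -> 1, a single wavefront suffices
```

The point of the sketch is only that a 15x latency cut shrinks the amount of parallelism the scheduler has to find, which is what "hiding 10-20 cycles is a lot easier than hiding 300" means in practice.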


You also have extremely knowledgeable individuals such as Dave Baumann saying things like this in reference to the XB1 GPU being compared to a Radeon 7770 GHz Edition and a Radeon 7790.

http://beyond3d.com/showpost.php?p=1762942&postcount=4633

I would wager that, when the ESRAM is used effectively, the performance of the Xbox One's graphics subsystem will far and away outstrip any of those discrete parts you mention.

How many people on here have the expertise to say Dave Baumann doesn't have any idea what the hell he's talking about? How many people on here have the expertise to say that an actual game developer has no idea what they are talking about? Really low-latency memory potentially providing a serious benefit to XB1 graphics performance isn't made up. I even know firsthand from a development source that the ESRAM's latency is indeed no insignificant part of the XB1's graphics performance. In fact, that development source makes the ESRAM sound pretty damn significant to performance. The reason I think so many have so much trouble accepting this is that they look at everything through the lens of console wars. Whenever they hear anything at all that might make the XB1 sound like a system that has some interesting design quirks or upsides, people immediately get defensive and somehow react as if this is an attempt to undermine the technical superiority of the PS4.
 

RoboPlato

I'd be in the dick
Oh good, we're back to talking about the magical benefits of low latency. Bandwidth >>>>> latency, especially when it comes to gaming. Every new type of RAM trades latency for bandwidth; we don't use DDR1 anymore despite it being so low latency.

ICE guys are having fun with it too.
DICE has been making fun of it as well.
 

i-Lo

Member
Oh good, we're back to talking about the magical benefits of low latency. Bandwidth >>>>> latency, especially when it comes to gaming. Every new type of RAM trades latency for bandwidth; we don't use DDR1 anymore despite it being so low latency.

And you wonder why people cling to rumours of 12GB well past unveiling.
 

Espada

Member
Oh good, we're back to talking about the magical benefits of low latency. Bandwidth >>>>> latency, especially when it comes to gaming. Every new type of RAM trades latency for bandwidth; we don't use DDR1 anymore despite it being so low latency.


DICE has been making fun of it as well.

I wonder how many people are going to avoid moving up to DDR4 memory because of the increased latency it has over DDR3. It's crazy how people are using this as a bogeyman, or as potential special sauce.
 

mavs

Member
I wonder how many people are going to avoid moving up to DDR4 memory because of the increased latency it has over DDR3. It's crazy how people are using this as a bogeyman, or as potential special sauce.

This was actually a legitimate problem going from DDR2 to DDR3. The frequency boost was significant, but the increased latency meant top-end DDR2 kits matched early DDR3 in synthetic bandwidth tests, and DDR2 was equal or better in benchmarks.

That only lasted about a year though before we started getting higher frequency kits.
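mavs' point checks out arithmetically. As a rough illustration (the kit speeds and CAS numbers below are typical retail examples I picked, not figures from the thread), absolute CAS latency in nanoseconds barely moved across the DDR2-to-DDR3 transition, because the higher cycle count was offset by the higher clock:

```python
# CAS latency in wall-clock time: cycles divided by the memory clock,
# where the memory clock is half the data rate (that's the "double" in DDR).

def cas_latency_ns(data_rate_mts, cas_cycles):
    """Absolute CAS latency in nanoseconds for a DDR kit."""
    clock_mhz = data_rate_mts / 2
    return cas_cycles / clock_mhz * 1000

print(cas_latency_ns(800, 5))   # DDR2-800 CL5       -> 12.5 ns
print(cas_latency_ns(1066, 7))  # early DDR3-1066 CL7 -> ~13.1 ns
print(cas_latency_ns(1600, 9))  # later DDR3-1600 CL9 -> ~11.2 ns
```

Higher CL numbers looked scarier on the box, but in nanoseconds the early DDR3 kits were only marginally slower, and once frequencies climbed they pulled ahead on latency as well as bandwidth.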
 

badb0y

Member
I'm doing nothing of the sort. Actual game developers, even a Sony first party dev, have suggested that really low latency would make a real difference in the performance and efficiency of the Xbox 1 hardware. That's not fictional, it's fact.

http://beyond3d.com/showpost.php?p=1696970&postcount=245



http://beyond3d.com/showpost.php?p=1696796&postcount=144



http://beyond3d.com/showpost.php?p=1697305&postcount=401




You also have extremely knowledgeable individuals such as Dave Baumann saying things like this in reference to the XB1 GPU being compared to a Radeon 7770 GHz Edition and a Radeon 7790.

http://beyond3d.com/showpost.php?p=1762942&postcount=4633



How many people on here have the expertise to say Dave Baumann doesn't have any idea what the hell he's talking about? How many people on here have the expertise to say that an actual game developer has no idea what they are talking about? Really low-latency memory potentially providing a serious benefit to XB1 graphics performance isn't made up. I even know firsthand from a development source that the ESRAM's latency is indeed no insignificant part of the XB1's graphics performance. In fact, that development source makes the ESRAM sound pretty damn significant to performance. The reason I think so many have so much trouble accepting this is that they look at everything through the lens of console wars. Whenever they hear anything at all that might make the XB1 sound like a system that has some interesting design quirks or upsides, people immediately get defensive and somehow react as if this is an attempt to undermine the technical superiority of the PS4.
This is why all your arguments falter: you keep saying low-latency memory would seriously benefit graphics performance, which is wrong. If that were the case, high-end parts like the HD 7970/GTX 680/Titan etc. would come with some sort of memory subsystem to help with latency, but they don't. The reason is that GPU architectures are highly parallel, so higher latency has little to no effect on performance.

The idea of having embedded memory isn't unique; Dirk Meyer actually thought about using it in APUs to help supplement bandwidth, because the biggest problem with AMD APUs was the lack of bandwidth, which would bottleneck the performance of the GPU, not latency. Guess what? AMD's future APUs will be compatible with GDDR5 memory... hmm, I wonder why that is? Obviously the ESRAM + DDR3 configuration can unlock some secret performance we have been overlooking!

Dave Baumann is talking about discrete parts in the quote you are using... we are not talking about discrete parts here; both the PS4 and Xbox One use highly customized parts that merely have desktop counterparts. All the other links are talking in hypotheticals with zero backup.
 
You also have extremely knowledgeable individuals such as Dave Baumann saying things like this in reference to the XB1 GPU being compared to a Radeon 7770 GHz Edition and a Radeon 7790.

http://beyond3d.com/showpost.php?p=1762942&postcount=4633



How many people on here have the expertise to say Dave Baumann doesn't have any idea what the hell he's talking about? How many people on here have the expertise to say that an actual game developer has no idea what they are talking about? Really low-latency memory potentially providing a serious benefit to XB1 graphics performance isn't made up. I even know firsthand from a development source that the ESRAM's latency is indeed no insignificant part of the XB1's graphics performance. In fact, that development source makes the ESRAM sound pretty damn significant to performance. The reason I think so many have so much trouble accepting this is that they look at everything through the lens of console wars. Whenever they hear anything at all that might make the XB1 sound like a system that has some interesting design quirks or upsides, people immediately get defensive and somehow react as if this is an attempt to undermine the technical superiority of the PS4.
Please explain how Dave Baumann is referring to the latency of the ESRAM in that post, and not the additional bandwidth (on top of the DDR3), which seems far more likely given the low bandwidth of the graphics cards being compared.
 

TheD

The Detective
Oh good, we're back to talking about the magical benefits of low latency. Bandwidth >>>>> latency, especially when it comes to gaming. Every new type of RAM trades latency for bandwidth; we don't use DDR1 anymore despite it being so low latency.


DICE has been making fun of it as well.

No, low latency is important... for high-clocked CPUs.

Because the consoles don't have high-clocked CPUs, and because GPUs need a lot of bandwidth (and don't really care much about latency), high-bandwidth RAM is preferable.
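To put rough numbers on the point that GPUs are bandwidth-hungry (this is my own back-of-envelope example, not TheD's; the resolution, target count, and pass count are all assumptions): even a single write-then-read pass over a deferred G-buffer consumes several GB/s before textures, shadow maps, or overdraw enter the picture.

```python
# Bandwidth consumed by writing a deferred G-buffer once per frame and
# reading it back once. Every parameter here is an illustrative assumption.

def gbuffer_gbps(width, height, targets, bytes_per_px, fps, passes=2):
    """GB/s: one write pass plus one read pass over every render target."""
    bytes_per_frame = width * height * targets * bytes_per_px
    return bytes_per_frame * passes * fps / 1e9

# 1080p, 4 render targets of 4 bytes each, at 60 fps:
print(round(gbuffer_gbps(1920, 1080, 4, 4, 60), 1))  # -> 4.0 (GB/s)
```

That is just one buffer touched twice; pile on texture fetches, shadow maps, post-processing, and overdraw, and the total climbs fast, which is why raw bandwidth tends to dominate latency for GPU workloads.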
 
This is why all your arguments falter: you keep saying low-latency memory would seriously benefit graphics performance, which is wrong. If that were the case, high-end parts like the HD 7970/GTX 680/Titan etc. would come with some sort of memory subsystem to help with latency, but they don't. The reason is that GPU architectures are highly parallel, so higher latency has little to no effect on performance.

The idea of having embedded memory isn't unique; Dirk Meyer actually thought about using it in APUs to help supplement bandwidth, because the biggest problem with AMD APUs was the lack of bandwidth, which would bottleneck the performance of the GPU, not latency. Guess what? AMD's future APUs will be compatible with GDDR5 memory... hmm, I wonder why that is? Obviously the ESRAM + DDR3 configuration can unlock some secret performance we have been overlooking!

Dave Baumann is talking about discrete parts in the quote you are using... we are not talking about discrete parts here; both the PS4 and Xbox One use highly customized parts that merely have desktop counterparts. All the other links are talking in hypotheticals with zero backup.

All the other links are from a first-party Sony developer... Your argument is essentially that they haven't done this on high-end PC cards, so there must be no benefit to the design at all. This level of ignorance is amazing considering what a much more limited 10MB of EDRAM meant to the 360's graphics performance. Was EDRAM a common occurrence in high-end PC GPUs when the 360 used it and benefited from it to such great extent?

GPUs are very tolerant of high latency, but that in no way means they wouldn't benefit greatly from really low-latency memory when there is a cache miss on the GPU. You're not really arguing with my comments; you're arguing with the statements of a highly experienced Sony first-party dev and of Dave Baumann, who has a level of professional experience with AMD GPUs that few do. Do you somehow believe that any statement coming from you or any other random poster is somehow less hypothetical than comments coming from such experienced individuals? The backup for their statements comes from their professional experience of actually designing AAA video games and helping design high-quality GPU hardware. Where does your backup come from?
 
ERP is a Sony dev so if he says differences in visuals between consoles will be small, then they really probably will be small.

I guess... the ESRAM is a huge deal. It sounds so unreal, though, that 32MB of ESRAM would offset so many large differences. Then again, I don't know what "small" is for ERP.
 
ERP is a Sony dev so if he says differences in visuals between consoles will be small, then they really probably will be small.

Where did he say that? Can't find it in the links above.

I guess... the ESRAM is a huge deal. It sounds so unreal, though, that 32MB of ESRAM would offset so many large differences. Then again, I don't know what "small" is for ERP.

The eSRAM might well be increasing the performance of the X1 GPU, and that's why Dave Baumann's statement that the X1 GPU will surpass comparable PC GPUs like the 7770 or the 7790 is most likely true. But the PS4 GPU also features customizations and massive bandwidth, which means it will surpass comparable PC GPUs like the 7850 or the 7870 too! And of course this is something that SenjutsuSage does not mention...
 
ERP is a Sony dev so if he says differences in visuals between consoles will be small, then they really probably will be small.

I guess... the ESRAM is a huge deal. It sounds so unreal, though, that 32MB of ESRAM would offset so many large differences. Then again, I don't know what "small" is for ERP.

I remember him saying he works for Sony but that he was not a dev, but I could be wrong.
 

Perkel

Banned
ERP is a Sony dev so if he says differences in visuals between consoles will be small, then they really probably will be small.

I guess... the ESRAM is a huge deal. It sounds so unreal, though, that 32MB of ESRAM would offset so many large differences. Then again, I don't know what "small" is for ERP.

This is what ERP wrote:

ERP said:

GPUs are not hindered by latency, thanks to parallelism. What he mentioned is that some things may work better with the ESRAM, not that the whole GPU will have exceptional efficiency.
 
I'm doing nothing of the sort. Actual game developers, even a Sony first party dev, have suggested that really low latency would make a real difference in the performance and efficiency of the Xbox 1 hardware. That's not fictional, it's fact.

http://beyond3d.com/showpost.php?p=1696970&postcount=245

Here's the problem that you, and all the other Xbox One dreamers, continue to ignore: "If the pool is actually SRAM as opposed to some EDRAM variant, then it would have very low latency; this would mean that using it as a data source would greatly increase GPU performance in memory-limited cases."

Yes, there will be cases where low-latency ESRAM will provide benefits. But guess what: there are going to be lots of situations where it won't. At different times a game is going to be shader limited or bandwidth limited or fill limited or limited by the amount of RAM available. So if the Xbox One has an advantage in a single corner case, and the PS4 is dramatically faster in literally every other situation, the net result is still the Xbox One being significantly slower. I know it's easy to fixate on one thing when it makes your preferred platform look better, but 32MB of low-latency ESRAM on the Xbox One cannot catapult it past the PS4, not even close, unless your entire game fits in that much memory. And even then it's still probably a toss-up.

Realistically it can't even close the paper gap by more than a few percent, and then only if you are aggressively optimizing for this advantage instead of doing the obvious and easy thing and just writing most of your buffers to ESRAM to avoid saturating the DDR3 bus.
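The "single corner case" argument above is essentially Amdahl's law: if only a fraction of the frame is latency-bound, cutting latency in that fraction moves the overall frame time far less than the headline number suggests. A minimal sketch (the fraction and speedup figures are invented for illustration):

```python
# Amdahl's-law style model: overall speedup when only one phase of the
# frame gets faster. All numbers below are invented for illustration.

def frame_speedup(latency_bound_fraction, latency_speedup):
    """Overall speedup given the fraction of time that actually benefits."""
    f = latency_bound_fraction
    return 1 / ((1 - f) + f / latency_speedup)

# Even a 10x latency win applied to 15% of the frame is only ~1.16x overall.
print(round(frame_speedup(0.15, 10.0), 2))  # -> 1.16
```

So a large local win in one bottleneck translates into a modest global one, which is the "few percent" intuition in the post above.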
 

AlphaDump

Gold Member
I remember him saying he works for Sony but that he was not a dev, but I could be wrong.

here is his blog if you follow his public profile on beyond3d:

http://www.blogger.com/profile/00814509809962646468


Gender Male
Industry Engineering
Occupation Game Developer
Location Redmond, Washington, United States
Introduction I've been writing games for over 20 years, doing everything from Gameplay to Graphics and Physics, currently I'm working for SCEA.

but on

Giantbomb it says

Rob has worked for Westwood Studios, Boss Game Studios (as Technical Director), Maxis Software / EA, and Microsoft.

http://www.giantbomb.com/rob-povey/3040-5074/

Redmond, Washington is an interesting location, but who knows.

His Twitter also says he put in his two weeks at Sony.
 
I remember him saying he works for Sony but that he was not a dev, but I could be wrong.

He's a first-party Sony dev, and a part of an incredible studio that has made amazing games. At least that's what I remember. Anything can change and people can move on, but I assume nothing has.

Here's the problem that you, and all the other Xbox One dreamers, continue to ignore: "If the pool is actually SRAM as opposed to some EDRAM variant, then it would have very low latency; this would mean that using it as a data source would greatly increase GPU performance in memory-limited cases."

Yes, there will be cases where low-latency ESRAM will provide benefits. But guess what: there are going to be lots of situations where it won't. At different times a game is going to be shader limited or bandwidth limited or fill limited or limited by the amount of RAM available. So if the Xbox One has an advantage in a single corner case, and the PS4 is dramatically faster in literally every other situation, the net result is still the Xbox One being significantly slower. I know it's easy to fixate on one thing when it makes your preferred platform look better, but 32MB of low-latency ESRAM on the Xbox One cannot catapult it past the PS4, not even close, unless your entire game fits in that much memory. And even then it's still probably a toss-up.

Realistically it can't even close the paper gap by more than a few percent, and then only if you are aggressively optimizing for this advantage instead of doing the obvious and easy thing and just writing most of your buffers to ESRAM to avoid saturating the DDR3 bus.

You point to memory limited cases, but ERP also states this regarding shaders.

http://beyond3d.com/showpost.php?p=1697305&postcount=401

IMO, and from what I've been told, most shaders spend more time waiting on memory than they do computing values; if that pool of memory is similar to L2 cache performance, you'd be looking at a cache miss dropping from 300+ GPU cycles to 10-20. Hiding 10-20 cycles is a lot easier than hiding 300.

IF the ESRAM pool is low latency then I think the Durango architecture is interesting.

So according to a confirmed dev, shaders spend more time waiting on memory than they do computing values. So the situations in which a very low-latency piece of memory might help performance and efficiency might not be nearly as limited as you think. Also, nobody is even remotely suggesting that the low-latency benefits of the ESRAM are a one-size-fits-all solution to any and all performance concerns, but if it can take some commonly expensive tasks and make them relatively cheap on XB1 hardware, that automatically benefits other aspects of a game, because the developer now has more power free to dedicate to other parts of their games.
 

R3TRODYCE

Member
Totally unrelated, but do you gents remember those Smash My (insert console name) websites? I wonder if someone is going to make ones dedicated to the next-gen consoles.
 

FINALBOSS

Banned
Oh yea, it must piss you off that other people get to have a say on this site, doesn't it?

No, I welcome it.

But your posts are so over-the-top in dreamland and have been proven inaccurate and misleading that I'm surprised you're still at it or not banned yet.

You've quoted that Dave post more than anyone else on the internet combined.
 

Perkel

Banned
Totally unrelated, but do you gents remember those Smash My (insert console name) websites? I wonder if someone is going to make ones dedicated to the next-gen consoles.

Dumb people will do dumb things, and the number of dumb people is unrelated to the year they live in. So yes, there will be smashed consoles.
 

TheCloser

Banned
He's a first-party Sony dev, and a part of an incredible studio that has made amazing games. At least that's what I remember. Anything can change and people can move on, but I assume nothing has.



You point to memory limited cases, but ERP also states this regarding shaders.

http://beyond3d.com/showpost.php?p=1697305&postcount=401



So according to a confirmed dev, shaders spend more time waiting on memory than they do computing values. So the situations in which a very low-latency piece of memory might help performance and efficiency might not be nearly as limited as you think. Also, nobody is even remotely suggesting that the low-latency benefits of the ESRAM are a one-size-fits-all solution to any and all performance concerns, but if it can take some commonly expensive tasks and make them relatively cheap on XB1 hardware, that automatically benefits other aspects of a game, because the developer now has more power free to dedicate to other parts of their games.

At this point, I just want to know how much Microsoft is paying you.
 

PerZona

Member
Will we see a difference in fps (or performance wise) for multiplatform games when comparing the PS4 to the XB1 because of the RAM difference? And how big of a difference?
 
The second I see a SenjutsuSage post, my eyes just glaze over and in my mind I say "oh brother."
As soon as I see his posts I just skip to the inevitable replies refuting almost everything he says. Which is a shame, because he probably has some interesting things hidden in there somewhere. But everyone knows the story of the boy who cried wolf...
 

FINALBOSS

Banned
As soon as I see his posts I just skip to the inevitable replies refuting almost everything he says. Which is a shame, because he probably has some interesting things hidden in there somewhere. But everyone knows the story of the boy who cried wolf...

That's kinda the thing, though... he doesn't.

His shit's been proven wrong so many times it'll make your head spin. He literally links to B3D (a Microsoft fanboy haven) all day long and treats the words of their "insiders" (not verified, and most certainly not insiders) like gospel.
 

FINALBOSS

Banned
It shouldn't take a year. This is not the PS3; devs do not need to figure it out. They already know how to code for this hardware, especially in the case of the PS4.

It's not about coding for the hardware; it's about having mature development tools. My one-year timeframe was just for things to fully mature.

And notice how Senjutsu is now gone? Drive-by postings in these kinds of threads, filled with nonsense.

Although I'm sure he'll read this now and come back.
 

Perkel

Banned
At this point, I just want to know how much Microsoft is paying you.

As soon as I see his posts I just skip to the inevitable replies refuting almost everything he says. Which is a shame, because he probably has some interesting things hidden in there somewhere. But everyone knows the story of the boy who cried wolf...

I think posting crap-throwing posts in a technical thread is far more damaging to the thread than anything he posted. At least he reposts things written by proper devs.

You, on the other hand, are throwing crap as juniors, contributing zero to the discussion. So either post something worthwhile to read, or read and be silent.
 

USC-fan

Banned
He's a first-party Sony dev, and a part of an incredible studio that has made amazing games. At least that's what I remember. Anything can change and people can move on, but I assume nothing has.



You point to memory limited cases, but ERP also states this regarding shaders.

http://beyond3d.com/showpost.php?p=1697305&postcount=401



So according to a confirmed dev, shaders spend more time waiting on memory than they do computing values. So the situations in which a very low-latency piece of memory might help performance and efficiency might not be nearly as limited as you think. Also, nobody is even remotely suggesting that the low-latency benefits of the ESRAM are a one-size-fits-all solution to any and all performance concerns, but if it can take some commonly expensive tasks and make them relatively cheap on XB1 hardware, that automatically benefits other aspects of a game, because the developer now has more power free to dedicate to other parts of their games.
ERP is not a Sony first-party dev.

Everyone agrees the ESRAM helps; it's the same reason the Wii U uses 32MB of eDRAM. But that is 0.4% of the RAM in the next-gen systems. It sucks that it's so small: it's very limiting, and devs are already having problems using the ESRAM to make up the difference.
 

FINALBOSS

Banned
ERP is not a Sony first-party dev.

Everyone agrees the ESRAM helps; it's the same reason the Wii U uses 32MB of eDRAM. But that is 0.4% of the RAM in the next-gen systems. It sucks that it's so small: it's very limiting, and devs are already having problems using the ESRAM to make up the difference.

It CAN help, but it requires extra effort to utilize properly. Cerny went over this: he stated they had a solution that used it, but decided, due to ease of use, to go with their unified solution.
 

badb0y

Member
All the other links are from a first-party Sony developer... Your argument is essentially that they haven't done this on high-end PC cards, so there must be no benefit to the design at all. This level of ignorance is amazing considering what a much more limited 10MB of EDRAM meant to the 360's graphics performance. Was EDRAM a common occurrence in high-end PC GPUs when the 360 used it and benefited from it to such great extent?

GPUs are very tolerant of high latency, but that in no way means they wouldn't benefit greatly from really low-latency memory when there is a cache miss on the GPU. You're not really arguing with my comments; you're arguing with the statements of a highly experienced Sony first-party dev and of Dave Baumann, who has a level of professional experience with AMD GPUs that few do. Do you somehow believe that any statement coming from you or any other random poster is somehow less hypothetical than comments coming from such experienced individuals? The backup for their statements comes from their professional experience of actually designing AAA video games and helping design high-quality GPU hardware. Where does your backup come from?
First, we have to stop talking about the PS4 and Xbox One as anything other than specialized PCs. They share the same architecture as a regular old PC, so they have the same advantages and disadvantages that PCs have. The fact that a lot of Xbox fans keep bringing up latency as a source of performance is a bit disturbing, because if that were the case we would have PC GPUs running on lower-latency VRAM with a supplemental SRAM on die (just like the Xbox One), but we don't, and the only time embedded RAM was considered was when we needed more bandwidth. Let me repeat myself: lower latency will not extract more performance from a GPU than having higher bandwidth, and anyone who tells you otherwise is pissing on 30+ years of computer development.

Second, and perhaps the most idiotic thing I have ever read on these forums, is the claim that lower latency is somehow better than having more CUs. This statement is so asinine I am not even sure how the author reached this conclusion. How is latency, something related to moving data around, supposed to boost performance over having raw GPU power?

Again I am not trying to antagonize you, just trying to engage in discourse.
It CAN help, but it requires extra effort to utilize properly. Cerny went over this: he stated they had a solution that used it, but decided, due to ease of use, to go with their unified solution.

Absolutely. Going with ESRAM becomes a necessity if you use DDR3 or slow GDDR5 as the system memory, because it helps make up the bandwidth loss. I don't think anyone is arguing that here.
 

TheCloser

Banned
I think posting crap-throwing posts in a technical thread is far more damaging to the thread than anything he posted. At least he reposts things written by proper devs.

You, on the other hand, are throwing crap as juniors, contributing zero to the discussion. So either post something worthwhile to read, or read and be silent.

Maybe you should actually read the whole thread, and not one post, to see what has been posted by myself and the other guy. You are not posting anything of value yourself, so you should probably read in silence. You were once a junior yourself, so step down from that high horse.
 

astraycat

Member
You point to memory limited cases, but ERP also states this regarding shaders.

http://beyond3d.com/showpost.php?p=1697305&postcount=401



So according to a confirmed dev, shaders spend more time waiting on memory than they do computing values. So the situations in which a very low latency piece of memory might help performance and efficiency might not be nearly as limited in its use as you think. Also, nobody is even remotely suggesting that the low latency benefits of the ESRAM is a one size fits all solution to any and all performance concerns, but if it can take some commonly expensive tasks and make them relatively cheap on XB1 hardware, that automatically benefits other aspects of a game, because now the developer has more power free to dedicate to other aspects of their games.

ERP must be mistaken on the cycle count. Unless AMD has miraculously managed to reduce cache latency by an order of magnitude from NI cards, there's no way that L2 is 10-20 cycles. AMD NI cards are ~300+ cycles to L1 alone. Missing to main memory is ~500+ cycles. Just a 30% reduction in cycle count from the previous generation would be incredible, but an over 90% reduction? That would be miraculous.
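For what it's worth, the arithmetic behind astraycat's skepticism is easy to check (the cycle counts below are the ones quoted in this thread, not anything I have measured):

```python
# How big a latency cut ERP's "300+ cycles down to 10-20" claim implies,
# versus the ~30% generational improvement astraycat calls "incredible".

def latency_reduction_pct(old_cycles, new_cycles):
    """Percentage reduction in latency going from old to new cycle counts."""
    return 100 * (old_cycles - new_cycles) / old_cycles

print(round(latency_reduction_pct(300, 20)))   # -> 93 (percent): the claim
print(round(latency_reduction_pct(300, 210)))  # -> 30 (percent): a plausible gen-over-gen gain
```

So the claim amounts to an over-90% latency cut against a baseline where a mere 30% would already be remarkable, which is exactly why it reads as miraculous.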
 

Lynn616

Member
It's not about coding for the hardware; it's about having mature development tools. My one-year timeframe was just for things to fully mature.

And notice how Senjutsu is now gone? Drive-by postings in these kinds of threads, filled with nonsense.

Although I'm sure he'll read this now and come back.

Avalanche Studios has already stated that the PS4's development tools are more mature than the X1's. There is no reason we will not see a difference in multiplatform games at launch.
 