
150MHz CPU boost on XBO, now in production


Vizzeh

Banned
I'd be inclined to believe that Penello maybe hasn't heard of them, given he's a marketing man. But with him chiming in on the technical nature of both consoles, I guess he should probably know which devs have worked for his company.

I think Albert Penello is a little more than marketing; I think he was involved in building the console, so he has a bit more technical knowledge than the typical PR guy.


Albert Penello thought that Europe is a country. I don't think you need any more explanation.

Wow. I get that sometimes people on the other side of the pond don't get Europe and its geography, but I was a little surprised he wouldn't. Even if you ignore that he doesn't know Europe isn't one country, how can we even entertain the idea that MS doesn't know this?

#geographyfail.

I stand corrected. I'll make sure the internal teams are aware of this too.
 

astraycat

Member
No way are we dealing with 400 cycles with the ESRAM right there on chip. Anything is slow in comparison to the cache, but I think you're horribly exaggerating how long it takes GCN to get the necessary information back in there to handle the GPGPU task. This takes me back to an earlier point: just because the PS4 has these extra optimizations, you're essentially discrediting the GCN architecture to the point of making it seem like it simply isn't effective at doing precisely what it was designed to do in the first place, which is handle GPGPU tasks alongside graphics-related tasks, and do so effectively.

Now, we may not agree on the minor details, but I simply cannot agree with you so strongly disregarding the ability of AMD's GCN architecture to accomplish the very thing it was designed to do. With some of the things you've said, and since no PC GPU has PS4-type optimizations, you're beyond the point of underestimating the GPGPU capabilities of the Xbox One architecture; you're effectively underestimating, and indirectly labeling as flawed, AMD's entire line of GCN-class GPUs. It may not handle it as effectively as the PS4 will, but the Xbox One can most definitely handle GPGPU. And GCN hardware that exists right now on the PC can handle GPGPU, or compute tasks such as TressFX, just fine. GCN was designed to handle these things, as was the Xbox One, even if not to the extent that the PS4 was. And one of Microsoft's key takeaways for the developers they briefed on the Xbox One architecture was to start looking at GPGPU. That's not something you say to developers if your hardware is not well equipped to handle it, or I would hope not.

Just my 2 cents. But this chat was very informative, and while I'll be heading off to bed, we'll definitely continue this tomorrow. Good chatting with you, dude. And night everyone!

400 cycles doesn't sound all that outrageous. Consider, L1 is 20x the cycles of LDS (according to the GCN slides presented at GDC Europe), and LDS is in the 10s of cycles for access, which means L1 is in at least the 200s of cycles. If L2 latency is even just 2x that of L1, then it will be 400 cycles long. Thus, if ESRAM is behind the L1/L2 cache hierarchy, then on a full miss it will at least incur the L2 cache latency.
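A rough back-of-the-envelope version of that estimate, as a Python sketch. The LDS figure and the 2x L2-to-L1 ratio are assumptions lifted from the reasoning above, not measured values:

# Mirrors the estimate above; every input is a rough assumption, not a measurement.
lds_cycles = 10                # LDS access taken at the low end of "10s of cycles"
l1_cycles = 20 * lds_cycles    # ratio cited from the GCN slides: L1 ~ 20x LDS
l2_cycles = 2 * l1_cycles      # assume L2 is only ~2x L1
print(l1_cycles, l2_cycles)    # -> 200 400; a full miss to ESRAM pays at least the L2 latency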
 

Chobel

Member
Albert Penello thought that Europe is a country. I don't think you need any more explanation.

Albert Penello said:
Raist said:
"Eurozone" is not a country, it's 17 countries. If for some reason you want to count it as one country, then do it for the xbox one too, in which case it will launch in 7.35 countries (six of the first launch wave are part of the eurozone, so that's 35% of that zone only). Because counting 17 as 1 or 6 of these 17 as 6 depending on the console is a bit misleading.

#geographyfail.

I stand corrected. I'll make sure the internal teams are aware of this too.

WTF? Is this a joke? Not only him but also the internal teams? What the hell Microsoft?
 

KidBeta

Junior Member
400 cycles doesn't sound all that outrageous. Consider, L1 is 20x the cycles of LDS (according to the GCN slides presented at GDC Europe), and LDS is in the 10s of cycles for access, which means L1 is in at least the 200s of cycles. If L2 latency is even just 2x that of L1, then it will be 400 cycles long. Thus, if ESRAM is behind the L1/L2 cache hierarchy, then on a full miss it will at least incur the L2 cache latency.

He was misunderstanding what he was reading. I was talking about the time to invalidate and flush the entire cache, but I was off by an order of magnitude anyway; it seems to be 4096 cycles to eSRAM (a 512KB cache flushed at 128 bytes per cycle).
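For reference, a quick Python check of that flush figure, using only the numbers stated in the post (a 512KB cache drained at 128 bytes per cycle):

cache_bytes = 512 * 1024                # 512KB cache, as stated above
bytes_per_cycle = 128                   # 128 bytes written back per cycle, as stated above
print(cache_bytes // bytes_per_cycle)   # -> 4096 cycles to flush the whole cache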
 
This is what you are saying, having a uni degree in computer science... maybe I did something like that...

And "developing time" for me could just mean... oh, you know, the Sony dev tools are 10x better than the X1's (and we already knew this).

And I repeat: he talked about power, then changed his mind. That's why it's ridiculous.

No, he never said anything different. He never talked about development time; that's just your misreading. When he said "when developing" he simply meant "while you are developing a game". He clearly said that during development of cross-platform games, the PS4 build generally runs 50% faster.
 

astraycat

Member
He was misunderstanding what he was reading. I was talking about the time to invalidate and flush the entire cache, but I was off by an order of magnitude anyway; it seems to be 4096 cycles to eSRAM (a 512KB cache flushed at 128 bytes per cycle).

Even if he was misunderstanding, 400 cycles sounds like an expected (if not reasonable) amount of latency for L2 on GCN. Thus, 400 cycles to ESRAM actually sounds pretty good.
 

KidBeta

Junior Member
Even if he was misunderstanding, 400 cycles sounds like an expected (if not reasonable) amount of latency for L2 on GCN. Thus, 400 cycles to ESRAM actually sounds pretty good.

Indeed, but he seems to not realise I was talking about the latency to flush the entire cache to memory, not the latency to read the memory :p
 

ekim

Member
The most important thing in that thread

[image: "shots not fired" GIF]

lol. But seriously, the 50% performance difference sounds reasonable IF he really means before optimization (a.k.a. brute-forcing). The more straightforward approach of the PS4 should make it easier to get a game to run on it without any hassle. If you do the same on the X1 you can't really make use of the special features. So it's basically the same as running it on two different PCs with different GPUs.

The difference should be smaller after optimization for both systems, with the PS4 still performing better, unless MS still has something to disclose.
 

Chobel

Member
lol. But seriously, the 50% performance difference sounds reasonable IF he really means before optimization (a.k.a. brute-forcing). The more straightforward approach of the PS4 should make it easier to get a game to run on it without any hassle. If you do the same on the X1 you can't really make use of the special features. So it's basically the same as running it on two different PCs with different GPUs.

The difference should be smaller after optimization for both systems, with the PS4 still performing better, unless MS still has something to disclose.

And that's exactly what he said in his comment in that thread.
 

astraycat

Member
lol. But seriously, the 50% performance difference sounds reasonable IF he really means before optimization (a.k.a. brute-forcing). The more straightforward approach of the PS4 should make it easier to get a game to run on it without any hassle. If you do the same on the X1 you can't really make use of the special features. So it's basically the same as running it on two different PCs with different GPUs.

The difference should be smaller after optimization for both systems, with the PS4 still performing better, unless MS still has something to disclose.

There are contrived cases where the differences between the two GPUs will be 50% or greater ;)

For example, if you're bottlenecked not by compute but by needing at least 102GB/s of texture bandwidth to main memory, then the PS4 should be able to fulfill those memory requests at least 50% faster, and up to ~159% faster in the extraordinarily contrived I'm-just-reading-and-not-writing-to-a-render-target-and-yet-I-need-at-least-176GB/s-from-main-memory case.

Granted, you'd have to fool the shader compiler to actually do the reads in such a case, but surely it could be engineered!
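A minimal Python sketch of the arithmetic behind those two percentages, assuming main-memory bandwidth is the only bottleneck and using the commonly quoted 68GB/s DDR3 and 176GB/s GDDR5 peaks:

xb1_main = 68.0     # GB/s to DDR3, ignoring ESRAM
ps4_main = 176.0    # GB/s to GDDR5
demand = 102.0      # GB/s of texture reads, the figure used above
# Time to serve the demand scales with demand / available bandwidth (capped at the peak).
xb1_time = demand / min(demand, xb1_main)
ps4_time = demand / min(demand, ps4_main)
print(round(xb1_time / ps4_time, 2))    # -> 1.5, i.e. PS4 ~50% faster at 102GB/s of demand
print(round(ps4_main / xb1_main, 2))    # -> ~2.59, i.e. up to ~159% faster if demand >= 176GB/s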
 

KidBeta

Junior Member
lol. But seriously, the 50% performance difference sounds reasonable IF he really means before optimization (a.k.a. brute-forcing). The more straightforward approach of the PS4 should make it easier to get a game to run on it without any hassle. If you do the same on the X1 you can't really make use of the special features. So it's basically the same as running it on two different PCs with different GPUs.

The difference should be smaller after optimization for both systems, with the PS4 still performing better, unless MS still has something to disclose.

Sure, but there's plenty of optimisations that work just as well on the PS4 as they do on the XBONE.
 

Chobel

Member
There are contrived cases where the differences between the two GPUs will be 50% or greater ;)

For example, if you're bottlenecked not by compute but by needing at least 102GB/s of texture bandwidth to main memory, then the PS4 should be able to fulfill those memory requests at least 50% faster, and up to ~159% faster in the extraordinarily contrived I'm-just-reading-and-not-writing-to-a-render-target-and-yet-I-need-at-least-176GB/s-from-main-memory case.

Granted, you'd have to fool the shader compiler to actually do the reads in such a case, but surely it could be engineered!

It's now 109GB/s min, 204GB/s max. I don't know how they got that, but this is what MS showed at Hot Chips.
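For what it's worth, the 109GB/s figure falls out of a simple calculation if you assume a 128-byte-per-cycle ESRAM path at the 853MHz GPU clock; treat both inputs as assumptions here, and the 204GB/s number as MS's quoted peak when reads and writes overlap in the same cycle:

bytes_per_cycle = 128    # assumed ESRAM width per direction
clock_hz = 853e6         # XB1 GPU clock after the upclock
print(round(bytes_per_cycle * clock_hz / 1e9, 1))   # -> ~109.2 GB/s for reads or writes alone
# 204GB/s is the quoted combined read+write peak; it is less than 2 x 109 because the
# overlap reportedly isn't available on every cycle.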
 

astraycat

Member
It's now 109GB/s min, 204GB/s max. I don't know how they got that, but this is what MS showed at Hot Chips.

That's to ESRAM only, which is why I described a contrived scenario where you'd need huge bandwidth for reads from main memory. More realistically, there'd be a good deal of bandwidth needed for the depth target and render target(s), which could feasibly be offloaded to ESRAM's magical 204GB/s max bandwidth.
 

Vizzeh

Banned
It's now 109GB/s min, 204GB/s max. I don't know how they got that, but this is what MS showed at Hot Chips.

Isn't this only good news for repeated textures and code? So basically we have to live with repeated muddy textures on X1, while comparably on the PS4 we could have very detailed textures, each differing from one another, given the 32MB versus the entire GDDR5 pool being available. So it's not so much that the X1 can keep up, but what is it showing? Obviously the X1 still has the 68GB/s, but the trade-off is surely still much lower resolutions/higher compression.
 
This is weird, how does he not know People Can Fly? They made the X360 exclusive, Gears of War: Judgment.

Did those former devs start a new studio or something?
I find it hard to believe he doesn't know People Can Fly.
But then again, after GoW:J maybe it's better; from what I heard it was the shittiest entry in the franchise, chasing that CoD crowd. Shame that Halo did the same.
 

KidBeta

Junior Member
Isn't this only good news for repeated textures and code? So basically we have to live with repeated muddy textures on X1, while comparably on the PS4 we could have very detailed textures, each differing from one another, given the 32MB versus the entire GDDR5 pool being available. So it's not so much that the X1 can keep up, but what is it showing?

Well, not exactly; it's useful for anything you want to access often that's equal to or less than 32MB.

Also, the most they have ever gotten out of it in the real world was 130GB/s, which was with alpha blending; that seems a little more realistic.
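Putting that real-world number next to the theoretical one, a one-line Python check using only the figures quoted in this exchange:

print(round(130.0 / 204.0 * 100, 1))   # -> ~63.7% of the 204GB/s combined peak
# Alpha blending is a read-modify-write on the render target, so it is the kind of workload
# that exercises simultaneous reads and writes and can approach the combined figure.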
 

Klocker

Member
From the rather detailed vgleaks info, it would seem that Microsoft has done next to nothing to combat it (aside from the aforementioned eSRAM).


Well, that's my point.

Vgleaks is not telling us everything about how they plan workarounds to have games developed efficiently.

Of course they have designed the system to account for many of the factors that you can manufacture as a problem; the eSRAM is the major example, and they have said in all of their talks that keeping the GPU filled efficiently was their main design goal.

Again, people who are not actively developing an Xbone game have no business telling everyone here exactly how the system will or will not fare. There are obviously workarounds to the things some of the armchair computer scientists here are trying to pass along as "facts".

It's getting a little silly.

This thread needs to close. It's starting to sound like FUD.
 

KidBeta

Junior Member
Well, that's my point.

Vgleaks is not telling us everything about how they plan workarounds to have games developed efficiently.

Of course they have designed the system to account for that, the eSRAM being the major example.

Again, people who are not actively developing an Xbone game have no business telling everyone here exactly how the system will or will not fare. There are obviously workarounds to the things some of the armchair computer scientists here are trying to pass along as "facts".

It's getting a little silly.

This thread needs to close. It's starting to sound like FUD.

Okay, I guess we need to close every single thread on console performance then? All of them? It's pretty obvious with a basic CS background how certain things will fare; you do not need to develop for the machine to know this. Just like you don't need to be an engineer working on a car to know one is better than the other, we know a lot of the details, and from that we can certainly ascertain what is going on. Whether or not that level of analysis and understanding is outside the realm of some posters is a different story, though.

People can wish all they want, but the fact of the matter is that everything I have said (aside from the minor corrections I have made) about this topic, to do with GPGPU and coherency, has been true based on the detailed information that vgleaks has. We do not need a developer to tell us that a cache invalidate/flush is bad; it should be common sense.
 

velociraptor

Junior Member
lol. But seriously, the 50% performance difference sounds reasonable IF he really means before optimization (a.k.a. brute-forcing). The more straightforward approach of the PS4 should make it easier to get a game to run on it without any hassle. If you do the same on the X1 you can't really make use of the special features. So it's basically the same as running it on two different PCs with different GPUs.

The difference should be smaller after optimization for both systems, with the PS4 still performing better, unless MS still has something to disclose.
But that suggests they could also further optimise for the PS4.
 

Vizzeh

Banned
This is weird, how does he not know People Can Fly? They made the X360 exclusive, Gears of War: Judgment.

EDIT: That pic has one tweet removed; apparently there's a reasonable explanation.


While checking the Twitter log:

Albert Penello @albertpenello 8h
@Datownkidd What devs?
10:27 PM - 7 Sep 13

Reply:

Ivan Espinoza @Datownkidd 8h
@albertpenello Adrian Chmielarz and his "friends".
10:41 PM - 7 Sep 13

Ivan Espinoza @Datownkidd 8h
@albertpenello Former People Can Fly dev.
10:43 PM - 7 Sep 13

Albert Penello @albertpenello 8h
@Datownkidd don't know 'em.
10:46 PM - 7 Sep 13


The timestamps suggest he saw it?
 

Klocker

Member
Okay, I guess we need to close every single thread on console performance then? All of them? It's pretty obvious with a basic CS background how certain things will fare; you do not need to develop for the machine to know this. Just like you don't need to be an engineer working on a car to know one is better than the other, we know a lot of the details, and from that we can certainly ascertain what is going on. Whether or not that level of analysis and understanding is outside the realm of some posters is a different story, though.

People can wish all they want, but the fact of the matter is that everything I have said (aside from the minor corrections I have made) about this topic, to do with GPGPU and coherency, has been true based on the detailed information that vgleaks has. We do not need a developer to tell us that a cache invalidate/flush is bad; it should be common sense.


Yes, it probably should, because the armchair game devs here don't know exactly how the hardware will be used. That is correct. When you develop a game on Xbone, are able to use all the tools, and are privy to the developer meetings where MS shares with you all of their plans and theories for how that hardware was designed to be used efficiently, beyond basic brute-force compute numbers and beyond vgleaks... then yes.

Until then it's their personal theory on something they do not completely understand, and it should not be presented here as fact.
 

Raist

Banned
How is it possible that someone does not know this, especially someone so high up in a big corporation? Is it not taught in high school in America or something?

I'm honestly not sure if he doesn't know it or if he was just trying to push the "well, it still launches in more countries than the 360" line of defense when they went from 21 to 13.
I knew it was BS, so I pointed out that no, the 360 didn't launch in 8 countries but around 20, and he replied by linking Wikipedia, which had a list of 8 "countries". Problem is, one of them was "eurozone".

So, yeah.


edit: that's how it went:

The program is in great shape, and nothing happening is really much different from the last three launches I've worked on - except we live in a world where information flows much more freely. Last generation, Xbox 360 launched in 3 countries, with another 8 following. So a 13 country launch is still more than Xbox 360.

Might wanna mention that the 360 launched in 20+ more countries (basically the whole of geographical Europe) only a couple of weeks after the US/JP launches, maybe.

Since I'm old and my memory goes, I went to Wikipedia on this one. According to that, we were in 11 countries before the end of the year we launched. We rolled out another 15 the following year. So 13 in one year beats that.

http://en.wikipedia.org/wiki/Xbox_360_launch
 

KidBeta

Junior Member
Yes, it probably should, because the armchair game devs here don't know exactly how the hardware will be used. That is correct. When you develop a game on Xbone, are able to use all the tools, and are privy to the developer meetings where MS shares with you all of their plans and theories for how that hardware was designed to be used efficiently, beyond basic brute-force compute numbers and beyond vgleaks... then yes.

Until then it's their personal theory on something they do not completely understand, and it should not be presented here as fact.

No one knows how the hardware will be used, not even Microsoft; that's the entire point of programmable devices. The point is that it's pretty easy to point out certain performance aspects of the system without being privy to all of the information that developers get.

It is pretty easy to ascertain some of these characteristics. If you don't want to be involved in the discussion then go ahead, but I don't really think it's fair that you try to get this and other threads closed because you yourself don't think you know enough to do a certain level of analysis on a machine without physically programming for it.

For example, I can tell you that having low bandwidth to a GPU is not good without programming for the device. People can tell you plenty of things about a system/design without actually programming for it and being privy to the very lowest-level details.
 
So are folks trying to spin Adrian C's tweet as ps4 being 50% faster Dev time? Holy shit this is a never ending saga of butt-hurt and denial.
 

Vizzeh

Banned
So are folks trying to spin Adrian C's tweet as ps4 being 50% faster Dev time? Holy shit this is a never ending saga of butt-hurt and denial.

A little bit, even though the original tweet said categorically it was "ps4 more powerful" -


Xbox One:
1.31 TFLOPS
40.9 GTex/s
13.6 GPix/s
68GB/s DDR3
109GB/s eSRAM
768 Shader units and 12 Compute units

PS4:
1.84 TFLOPS (+40%)
57.6 GTex/s (+40%)
25.6 GPix/s (+90%)
176GB/s GDDR5
1152 shader units and 18 compute units
PS4 GPGPU tasks 4 x compute command processors

So maybe he is factoring in everything we know and don't know... hUMA?

I dunno, but the last two threads got locked because, I think, it was claimed that the 50% increase is second-hand knowledge. I'm not sure it's a good idea going over it again, or this thread may get locked too.
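For reference, a quick Python check of the percentage deltas in that list, using only the numbers as listed:

xb1 = {"TFLOPS": 1.31, "GTex/s": 40.9, "GPix/s": 13.6}
ps4 = {"TFLOPS": 1.84, "GTex/s": 57.6, "GPix/s": 25.6}
for k in xb1:
    print(k, "+%.0f%%" % ((ps4[k] / xb1[k] - 1) * 100))
# -> TFLOPS +40%, GTex/s +41%, GPix/s +88% (the list above rounds the last one up to +90%)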
 

Klocker

Member
KidBeta, yes... You can make generalized statements about computer science facts like that. But until we know more details about how the system is designed to be leveraged and the tools MS has designed, I ask that people stop stating as "fact" what the Xbone will or won't be able to do. Because they really don't know.

They know certain laws of computing, as you described above, and theory on how it should or might work, but generalizing as to what "the Xbone will or won't be able to do compared to PS4" is not being sincere, because it is impossible to state as fact unless you are developing a game on Xbone.
 

Vizzeh

Banned
I would imagine it still doesn't mean he personally knows the people being referred to, specifically. Regardless, the overreactions are disconcerting, given AP posts here frequently.

By all means people, get excited about next gen, but let's avoid acting like you're taking sides in a conflict.

Yeah, to be honest Albert is a good guy. I would rather not paint a negative picture of him; he doesn't seem the kind of guy to sidestep devs in such a condescending way. Benefit of the doubt needs to be given, imo :)
 

KidBeta

Junior Member
Yes. You can make generalized statements about computer science facts like that. But until we know more details about how the system is designed to be leveraged and the tools MS has designed, I ask that people stop stating as "fact" what the Xbone will or won't be able to do. Because they really don't know.

They know certain laws of computing, as you described above, and theory on how it should or might work, but generalizing as to what "the Xbone will or won't be able to do compared to PS4" is not being sincere, because it is impossible to state as fact unless you are developing a game on Xbone.

They can both do the same things; at what speed is the question. And the question I've been asking and trying to answer is a little more complicated, involving caching systems and coherency along with GPGPU, but we seem to know enough to be able to answer these questions rather well.
 

artist

Banned
While checking the Twitter log:

Albert Penello @albertpenello 8h
@Datownkidd What devs?
10:27 PM - 7 Sep 13

Reply:

Ivan Espinoza @Datownkidd 8h
@albertpenello Adrian Chmielarz and his "friends".
10:41 PM - 7 Sep 13

Ivan Espinoza @Datownkidd 8h
@albertpenello Former People Can Fly dev.
10:43 PM - 7 Sep 13

Albert Penello @albertpenello 8h
@Datownkidd don't know 'em.
10:46 PM - 7 Sep 13


The timestamps suggest he saw it?
And people were suggesting that the person who posted the original tweet image had REMOVED the Former People Can Fly dev tweet? smh.
 
A little bit, even though the original tweet said categorically it was "ps4 more powerful" -


Xbox One:
1.31 TFLOPS
40.9 GTex/s
13.6 GPix/s
68GB/s DDR3
109GB/s eSRAM
768 Shader units and 12 Compute units

PS4:
1.84 TFLOPS (+40%)
57.6 GTex/s (+40%)
25.6 GPix/s (+90%)
176GB/s GDDR5
1152 shader units and 18 compute units
PS4 GPGPU tasks 4 x compute command processors

So maybe he is factoring in everything we know and don't know... hUMA?

I dunno, but the last two threads got locked because, I think, it was claimed that the 50% increase is second-hand knowledge. I'm not sure it's a good idea going over it again, or this thread may get locked too.

I'm well aware of the specs, I'm just amazed that people can take clear/concise comments and a forum post, and spin it into something else.
 

Ubiquitar

Neo Member
WTF? Is this a joke? Not only him but also the internal teams? What the hell Microsoft?

It's an American thing. Geography is not our forte. Like that joke George Lopez once told: "My friend asked me where I was from, so I told him Honduras. He then asked me what part of Mexico that was!" (to paraphrase)

I would like to clarify that I am well aware that the EU is many countries and not one, and that Honduras is also its own country. This much I know.
 

BenouKat

Banned
So maybe he is factoring in everything we know and don't know... hUMA?

I think you're overthinking it. The guy just made a general statement of what his friend told him.

It's simply easier to say 50% than 40 or 37.5 or whatever. It's a rough approximation; he didn't calculate the exact number before posting, and the "50% speed" is itself an approximation. Speed, power, computation, whatever.

I don't think you should launch an investigation into why he said 50% and not 48.75% or something like that.
 
The thread is going to get closed. I suggest we get back on topic; there isn't much else to discuss regarding Adrian C's post. It is what it is.
 

ypo

Member
lol. But seriously, the 50% performance difference sounds reasonable IF he really means before optimization (a.k.a. brute-forcing). The more straightforward approach of the PS4 should make it easier to get a game to run on it without any hassle. If you do the same on the X1 you can't really make use of the special features. So it's basically the same as running it on two different PCs with different GPUs.

The difference should be smaller after optimization for both systems, with the PS4 still performing better, unless MS still has something to disclose.

Awesome only the Xbone can have the special optimization. Gots to close the gap, exclusively on Xbone™.
 

Vizzeh

Banned
I'd imagine most people have these numbers memorized by now?

Really, it seems every thread involving any technical discussion is doomed to devolve into a dickwaving contest?

Yeah, true; it was solely for context, as the percentage quoted yesterday doesn't seem that far away from what we assumed was reasonable fact. It sorta lines up. There are so many numbers being thrown around: 30%, 40%, 50%.

It was more of a question, I suppose, as to where the other 10% was coming from to reach the full 50% figure. It's getting a little old now, though.

Edit: BenouKat made a good point; it's more than likely just a rough approximation (although I expected better from a relatively recognized dev).
 