Albert Penello puts dGPU Xbox One rumor to rest

Euuugh... Seriously man, just stop this. It looks no better when you comment on how "we don't know the full story from MS' side" when the exact same thing can very well be said about your perspective on Sony's hardware and engineering. And again, you comment on this with an extremely obvious bias in the discussion. I don't think it's beneficial for you to participate in these discussions; again, I feel like it's going to do more damage to people's perspective of your (and other MS representatives') participation on these forums when you come in here and start engaging in hardware comparisons with your direct competition.

Oh, who made you the judge? I think he has added more to these discussions than you have. And why wouldn't he have some bias towards an item he created and is invested in? Or do you want nothing that puts up a defense for "their" side?

Bah, I'm probably preaching to the choir here....
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.

I really don't want to be disrespectful. And just to preempt unproductive ad hominems from others about who is or isn't an engineer: I am an engineer, but this statement of yours really does not make any sense at all.
 

shandy706

Member
I expect PC to win the face-offs easily and it will be mighty embarrassing for Sony. If the differences are small they still lose, because optimization. PC fans get the superior game and customer satisfaction, get to say games will continue to be optimized down the line as devs "get to grips / huma etc" so the face-off gaps will widen. Whilst PS4 fans will have no comeback.

Why are you so passionate about this?

Personally I buy exclusives for my consoles. PC is for cross platform.

Getting joy from a slight performance difference in actual gameplay is silly. Only if it truly cripples one version does it really matter.
 

krioto

Member
I don't understand why people cannot appreciate his participation or perspective without trying to shut him up from giving the perspective from the other side.

Why try to shut down the conversation? He obviously would not be coming on here putting it on the line if he were not confident he could back it up when the games come out over the next year or two.

The "assumption that ps4 is auto-40% better in games" deserves to be challenged

It's because the spec list he posted above is utter rubbish - re-tag that man.
 

velociraptor

Junior Member
It's the same thing...

You can't add bandwidth like that...

Also, how exactly are you folks calculating the 204GB/s value?

I thought you weren't aware of Sony's final CPU clock speed...

What is this even supposed to mean?

PS4 also has 30GB/s with Onion and Onion+...
He spoke about GPGPU, but the PS4 has 6 additional compute units. That's pretty significant. It means the PS4 can offload more general-purpose tasks than the Xbox One.

Furthermore, it appears that the 360 has more memory bandwidth than the Xbox One (based on Albert's calculations).
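
For what it's worth, here is one plausible reconstruction of the 204GB/s number, going by the explanation the Xbox One architects later gave Digital Foundry; the "writes usable on roughly 7 of every 8 cycles" figure is their claim, so treat this as a sketch rather than gospel:

Code:
# Hypothetical reconstruction of the 204GB/s ESRAM peak (assumptions above):
# 128 bytes per cycle in each direction at the 853MHz post-upclock GPU speed,
# reads every cycle, writes usable on roughly 7 of every 8 cycles.
clock_hz = 853e6
bytes_per_cycle = 128
one_way = clock_hz * bytes_per_cycle / 1e9    # ~109.2 GB/s per direction
combined_peak = one_way * (1 + 7 / 8)         # ~204.7 GB/s "peak on paper"
print(one_way, combined_peak)

Even granting that, it is a theoretical peak for particular read/write mixes inside the 32MB, not a sustained figure.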
 

Finalizer

Member
The GameCube had weaker hardware than the Xbox, and GameCube games were pretty competitive throughout the generation. It's not like this hasn't happened before. Based on AMD's description of next-gen console work, they were both targeting the best possible visuals for a console budget; Sony just happened to bet on GDDR5, but we already know MS couldn't go that route from the start because they absolutely wanted 8GB and nothing less. Some good timing and a little luck helped Sony be where they are today.

I'm not saying the Xbone won't be a good system, or that it can't have games that look good or perform well. I'm just saying, there's a power gap, and there's no getting around that simple fact. Feel free to debate how much that power disparity will actually end up showing up in games (if at all), but these attempts to pretend that there's still some minor secret sauce in the system that'll make up the difference (lol, SHAPE will offload enough sound processing off the CPU to make up for the graphics), or that some magic software engineering on MS' part can make up for the sheer horsepower difference in GPUs, are fool's errands.
 

jett

D-Member
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.

wow-wow-the-hangoverikucr.gif
 

VanWinkle

Member
Good post. I want more of a Kinect fitness game, please tell Phil.

Haha, of course you think it's a good post. Did you read any of the replies? Do you have a decent knowledge of how tech works? I feel like anybody who does would see some really simple and amateur mistakes. I like Albert, but that post was...let's say...not good.
 
I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they popped up, so I apologize for the delayed response.

I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.

So I thought I would add more detail to what I said the other day, so that perhaps people can debate the individual merits instead of making personal attacks. This should hopefully dismiss the notion that I'm simply creating FUD or spin.

I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons, since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront that I have nothing but respect for those guys, but I'm not a fan of the misinformation about our performance.

So, here are a couple of points about some of the individual parts for people to consider:

• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.

Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.

I still believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around – they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.

Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible than I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.

Thanks again for letting me participate. Hope this gives people more background on my claims.

So why do we have devs saying the PS4 is so much more powerful/faster?

Why are basically all third-party titles at shows being shown on PS4 hardware/kits?

It seems devs are firmly in the camp that PS4 is the console packing the most punch.
 

beast786

Member
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.

This is insulting, and I am not even a tech guru. You don't just add bandwidth together, as the ESRAM is not all of the 8GB.

What also amazes me is that your whole post's defense is exactly what Leadbetter used in his article, which also discredits him and his biased information source.

Regardless, thanks for the info.
 

Klocker

Member
He's an official Microsoft spokesman who is adding bandwidth numbers together and pretending the total is a meaningful number. That was bad in 2005; in 2013, it's frankly unbelievable.

I have heard this around here for 3 months now, but on other boards where people know their tech stuff, it is acceptable. This is the only place where people can't accept it.

Did you help design and engineer their memory subsystem? I know I didn't, and I'm pretty sure Albert has talked to the engineers who did.
 
For instance, 30GB/s is 3x the PS4's coherent bandwidth, so the PS4 is obviously 10GB/s, which we already know is not true.

I'm not dismissing you, but how do you know this? I thought vgleaks always said the PS4 CPU was <20GB/s? That means it never hits 20GB/s, right?
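
A quick sanity check of that claim, using only the numbers floating around this thread; the vgleaks ~20GB/s figure is a rumor, so treat it as an assumption:

Code:
# If Xbox One coherent bandwidth is 30GB/s and that is "3X" the PS4's,
# the implied PS4 figure contradicts the rumored vgleaks number (~20GB/s).
xb1_coherent = 30.0                 # GB/s, Penello's figure
implied_ps4 = xb1_coherent / 3.0    # 10 GB/s, implied by the "3X" claim
vgleaks_ps4 = 20.0                  # GB/s, rumored cap (assumption)
print(implied_ps4, vgleaks_ps4)     # 10.0 vs 20.0 -- the claims don't line up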
 

Vizzeh

Banned
I don't understand why people cannot appreciate his participation or perspective without trying to shut him up from giving the perspective from the other side.

Why try to shut down the conversation? He obviously would not be coming on here putting it on the line if he were not confident he could back it up when the games come out over the next year or two.

The "assumption that ps4 is auto-40% better in games" deserves to be challenged
Of course it needs to be challenged, mate, but what he is saying is not making sense at all. He is supposed to be a tech guy, yet he's trying to blind people with science (apparently). I think Albert is a decent fellow, but it sounds like damage control blowing up. This isn't 1999; there are a lot of educated people on this board who simply want the numbers to add up and make sense.
 

hawk2025

Member
• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.


I don't understand how these two statements are logically consistent.

By the distributive property:

(X + X + ... + X) * 1.06 = 1.06*X + 1.06*X + ... + 1.06*X.

And that's assuming the CUs scale linearly.

Unless I'm misunderstanding you, you said the exact opposite -- having additional CUs does not result in a linear increase in power. In other words, the combined total is less than the sum of the parts:

Total(X, X, ..., X) < X + X + ... + X.

But a clock bump multiplies every CU, and therefore the total, by the same 1.06. So by your very own logic, having each CU running 6% faster is at best **equal** to a 6% increase overall -- it can never be better.
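
hawk2025's point is easy to sanity-check numerically. Here is a minimal sketch, assuming a toy model in which total throughput is clock x CU count x per-CU work x a sub-linear scaling efficiency; the 1%-per-CU penalty is made up purely for illustration:

Code:
# Toy model: total throughput = clock * CU count * work per CU * efficiency(CUs).
# The 1%-per-CU efficiency penalty is hypothetical, for illustration only.
def throughput(clock_ghz, n_cus, work_per_cu=128.0):
    efficiency = 1.0 - 0.01 * (n_cus - 1)
    return clock_ghz * n_cus * work_per_cu * efficiency

# Adding CUs scales sub-linearly in this model (~40% gain, not 50%):
print(throughput(0.800, 18) / throughput(0.800, 12))   # ~1.40

# But a clock bump multiplies the whole expression, so the overall gain is
# exactly the clock ratio, regardless of how badly the CUs scale:
print(throughput(0.853, 12) / throughput(0.800, 12))   # 1.06625

However the CUs scale with count, 800MHz to 853MHz is a ~6.6% increase overall -- exactly, no more and no less.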
 

SenkiDala

Member
Third party vs. first party with a year difference.

Well yeah, I haven't thought about that. :D But still I don't think it's such a bad comparison. I hope so.

Whatever, the most important thing is still the games; there will be awesome games on PS4 and also on X1, no doubt about it. It just makes me tired to keep reading (yeah, I dunno why I continue) about this specs war... I think I'm getting too old for that (almost 30); it's so pointless. My games of the gen are Xenoblade, The Last of Us, the Mass Effect trilogy, Muramasa, Dead Rising, Lost Odyssey... So anyway, there'll be good games on every platform. :)
 

pixlexic

Banned
It's not about arguing; it's about presenting facts. Tech-heads are genuinely curious... even ones who aren't really interested in console gaming. So stop acting like everyone is here to put down the Xbone because our minds are made up on the PS4 being the more capable machine.

We want explanations. This was not one of those explanations.

I didn't say anything about putting down the Xbone. It has nothing to do with that. I just know how these things go... you never really know until the games start coming out.
 

kpaadet

Member
Precisely. Take State of Decay, for example. Frame rate issues, tearing, glitches and all, but it was a fantastic game. It was first and foremost a fun game, and that's what matters to me. Something like Second Son visually looks really good, but if it isn't fun then that's it, game over. For me at least it's "I had fun playing this game, it's a really good game" > "look how many particle effects we can put in at one time". Sometimes when I get on here I feel like I'm the only one that thinks along these lines.

I don't get your argument. Are you saying State of Decay is fantastic because it has tearing, framerate issues, etc.? Would it be worse on better hardware? Of course games are king and in the end that is what matters, but having better hardware also often makes for better experiences and allows devs to do new things.
 

Kaako

Felium Defensor
SMH @ some of the statements in Albert's latest post. You do know that some people here actually know their shit and have already called you out on some of the bullshit statements in your post, right?
 

Soulflarz

Banned
I deleted half of it. I think I need to do it again.

Edit: Albert, I just don't understand how you have the details on Sony's hardware to make these statements. What am I missing?
That's not what bothers me.

Isn't it kind of implied that if MS has amazing guys who will make everything work super fast, then so will Sony, except for the one thing that's blatant, aka "configured" RAM? It's assuming their competition isn't doing similar things.
 

Curufinwe

Member
I have heard this around here for 3 months now, but on other boards where people know their tech stuff, it is acceptable. This is the only place where people can't accept it.

It's an obviously misleading comparison because the ESRAM is only 32 MB, not 8GB.

The people on those other boards do not know their tech if they think it's acceptable.
 

AzerPhire

Member
You guys do realize he is not saying the Xb1 is more powerful than the PS4, right? He is only saying the difference is not 40%, as most of you seem to believe.

Whether it is 30% or 10%, does it really matter to you?
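
For reference, the ~40-50% figure being argued about is just paper FLOPS from the publicly known GCN layout (64 shaders per CU, 2 FLOPs per shader per cycle); whether it shows up in games is the real debate:

Code:
# Paper math behind the oft-quoted gap: raw shader FLOPS only, not game performance.
def tflops(n_cus, clock_mhz):
    shaders = n_cus * 64                          # 64 shaders per GCN CU
    return shaders * 2 * clock_mhz * 1e6 / 1e12   # 2 FLOPs per shader per cycle

ps4 = tflops(18, 800)    # ~1.84 TFLOPS
xb1 = tflops(12, 853)    # ~1.31 TFLOPS
print(ps4 / xb1)         # ~1.41, i.e. PS4 has ~40% more raw GPU FLOPS on paper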
 

TheD

The Detective
I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they popped up, so I apologize for the delayed response.

I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.

So I thought I would add more detail to what I said the other day, so that perhaps people can debate the individual merits instead of making personal attacks. This should hopefully dismiss the notion that I'm simply creating FUD or spin.

I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons, since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront that I have nothing but respect for those guys, but I'm not a fan of the misinformation about our performance.

So, here are a couple of points about some of the individual parts for people to consider:

• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.

Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.

I still believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around – they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.

Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible than I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.

Thanks again for letting me participate. Hope this gives people more background on my claims.

So much wrong with that post.

Graphics are an embarrassingly parallel load, which means you have very little scaling hit.
You do not address the massive ROP difference, which makes the PS4 even faster than the extra CUs alone would.

I do not see anything that points to it not being just a simple clockspeed increase.

You cannot add up the bandwidth like that!
The PS4 can do its peak bandwidth across all 8GB, vs. a system that only gets 68GB/s from its main RAM, where the majority of its bandwidth comes from a tiny (32MB) pool of SRAM!

You should not know the PS4's CPU clockspeed.
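
To put the bandwidth objection in concrete terms, here is a minimal sketch; the hit rates are made-up illustrations, and the only point is that the 204GB/s pool is 32MB, so any effective figure has to be weighted by how much traffic actually stays inside it:

Code:
# Why "68 + 204 = 272GB/s" is misleading: weight each pool by the share of
# traffic it serves. Hit rates below are purely illustrative assumptions.
ESRAM, DDR3, GDDR5 = 204.0, 68.0, 176.0    # GB/s, paper peaks

def xb1_effective(esram_hit_rate):
    return esram_hit_rate * ESRAM + (1.0 - esram_hit_rate) * DDR3

for hit in (0.25, 0.50, 0.75):
    print(f"{hit:.0%} in ESRAM -> {xb1_effective(hit):.0f} GB/s vs PS4's flat {GDDR5:.0f} GB/s")
# 25% -> 102, 50% -> 136, 75% -> 170: the combined figure only beats 176 GB/s
# once the vast majority of traffic fits inside the 32MB window.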
 

Finalizer

Member
Oh, who made you the judge? I think he has added more to these discussions than you have. And why wouldn't he have some bias towards an item he created and is invested in? Or do you want nothing that puts up a defense for "their" side?

Bah, I'm probably preaching to the choir here....

I'm not acting as a judge on anything, I'm just giving my own personal advice based on what I'm seeing in the situation.

Here's the fact of the matter - his angle on Sony's hardware and engineering is no more informed than anyone else's here, aside from maybe some of the Sony insiders. Therefore, he adds nothing of additional value to these discussions compared to any typical console warrior. Add to that his status as an MS representative on these forums, and it becomes easy to paint him as obviously marketing his product on misinformation and trying to twist the truth - regardless of whether you personally think that, he's opening himself up to be criticized by others for that very reason. It poisons his image, and that of MS reps in general, and that's not something he needs to do to himself at this point.

That's why I've said numerous times in the past - MS can use their presence here to open a direct line of communication between the GAF audience and someone directly connected to MS. That's actually very powerful, and can be very convenient for both sides. Take a few weeks ago, when we were able to get direct clarification on the Kinect functionality issues in some of the launch countries - that's exactly the kind of cool thing we get out of this (even if I'm disappointed that there's still no "official chart" to come, though I remain at least hopeful that it's somewhere in the works...), and seeing more of that would do well to build back up some goodwill toward MS. However, that's not going to happen when it's very easy to point to some of his posts and just say "he's pushing his agenda on the forums."

So that's why I say he (and Nelson, and any other reps - MS or otherwise) should stay on the sidelines in these debates. If they want to correct outright lies, fine, but trying to add their own piece to these discussions in particular frankly does them no good.
 
Let me give you an analogy. You can have two cars.

One car is a BMW 335i, a 330hp coupe. You have tuned the suspension settings. The gear ratios are perfect. The tyres have the 'right' pressure and width. The torque, camber, ride height, aerodynamics are all 'balanced' - you get the picture.

The other is a BMW M3. It has no such tuning, but is packing a 414hp V8 engine.

What will you place your bets on?

The M3, because it's already tuned and balanced from the get-go.
 

AustinFelix

Unconfirmed Member
Albert, or anyone here who knows:

The PS4 features a separate chip to encode/decode video for the streaming feature, so as not to take away performance from games. Is this the same for the Xbone?
 

VanWinkle

Member
You guys do realize he is not saying the Xb1 is more powerful than the PS4, right? He is only saying the difference is not 40%, as most of you seem to believe.

Whether it is 30% or 10%, does it really matter to you?

People were pointing out that the things he wrote as facts were simply wrong. Should people have just sat idly by and pretended he didn't say anything?
 
Hmm. Who to believe?

"Every developer we've talked to said there's a 50% speed difference"

or

"differences are greatly overstated", despite openly admitting they don't know the engineering design inside the PS4
 

Bsigg12

Member
Well, of course, but there are different ways to support it: talk about things your project can do that the opposition can't; don't continue to bring up the raw power aspect, as it's a losing battle.

Well, he has been talking about those other things. In this context he was putting a rumor to rest regarding technical stuff. You can't fault him here, since he had been questioned constantly about a potential dGPU ever since that stupid article was written.
 

FINALBOSS

Banned
You guys do realize he is not saying the Xb1 is more powerful than the PS4, right? He is only saying the difference is not 40%, as most of you seem to believe.

Whether it is 30% or 10%, does it really matter to you?

Heaven forbid facts satisfy people's questions. Albert did not give us facts.
 
I'm surprised Albert even bothers to post on here with the way some of you speak to him...

Me too. You would think that being the designated tech guy and consistently getting called out for your bullshit would be embarrassing enough. The "We CREATED DirectX" bit was hilarious, but that post took the cake. He really believed he could just add the peak bandwidths together and everyone would fall for it.

That is definitely his last post on GAF, I guarantee it.
 