
Albert Penello puts dGPU Xbox One rumor to rest


nib95

Banned

Damn. Amazing lol.
 

Finalizer

Member
People have pointed out that MS (seemingly) only started gaining interest in GAF after the DRM fiasco. I wouldn't be at all surprised if they didn't know what kind of a den of truth wolves they were jumping into and were hoping that the spin would trickle down from the "hardcore" to their more casual friends.

And based on what I've seen, that's just backfiring spectacularly.

Because he's engaging in these discussions, it's so easy to see his participation on these forums as being designed to do just that - spin news in the hopes of changing the narrative. And that kind of image is not what MS needs to be painting themselves with right now, not when there are plenty of other things that Penello and co. could be doing around here instead, plenty of things that could be building genuine good will.

EDIT:

I have to imagine it's a situation where someone who has been closely associated with the project for years doesn't want to downplay his creation. Albert's posts are usually honest and do away with the PR speak, but he's not going to admit there's a significant difference in power, especially if that difference has a tendency to be exaggerated.

That's why I say he shouldn't get into these discussions in the first place - his own additions to these discussions aren't going to provide anything of value to them, and simply open up someone directly related to MS to criticism on the whole matter.

Here's another perspective - despite having multiplat titles that were demonstrably inferior, the PS3 still went on to do fine, and has managed to keep general hardware sales parity with the 360. So if hardware differences don't matter in such a significant way to the grand scheme of things, why open yourself up to that discussion in the first place? Focus on what's important, and on building back up the good will and trust of the community. Don't drag that down by getting dirty in the mudslinging.
 

nib95

Banned
Are you kidding? I mean really - are you kidding?

This is part and parcel of the territory here. You have to answer for your statements, especially if you're here in an official capacity. People get banned for being out of line, but poking holes in the arguments of other posters is well within the rules.

I've had my work here both praised and eviscerated, called out by numerous forum folks both publicly and via PM when I got stuff wrong, and I'm a goddamn admin. Guess what - I wouldn't have it any other way. That is what makes NeoGAF what it is.

There are many, many people who are more than capable of assessing, vetting and debunking technical claims and they have every right to do so. That's the price of doing business here. If we had official Nintendo or Sony reps on board, they would be subject to the same process.

If you're scared, buy a dog.

Holy crap this thread keeps giving. What a fantastic post.
 

Skeff

Member
Sure. If you say so

Look at my posts. Albert clearly stated that at 30GB/s the XB1 has 3x more coherent bandwidth than the PS4. This means Albert is stating the PS4 has a maximum of 10GB/s coherent bandwidth, which we know is factually incorrect.
 

KidBeta

Junior Member
Sure. If you say so

I'm sorry to break it to you, but there are people on here who are very familiar with what he is talking about, and to anyone who understands at least the basics of a lot of it, you can easily tell it's complete crap.

You can't simply add up all your bandwidth from different memory pools and say 'that's that'. You can't say there is a lack of parallel scaling on graphics when 99% of the time the problem is hugely parallel, which is why graphics cards go with so many concurrent SIMD units that are clocked lower than, say, a CPU.

He is spreading disinformation, but it's his job, so you can't really blame him. The problem starts when people like you blindly believe what he says when he is wrong, oh so very wrong.
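To make the "you can't just add up bandwidth from different pools" point concrete, here is a rough sketch (my own illustration with hypothetical traffic splits, not figures from anyone in the thread): even if two pools can transfer at the same time, the busier pool's share of the traffic sets the pace, so the sum of the two peaks is only approached when the traffic happens to split in exactly the right ratio.

```python
# Rough sketch: effective bandwidth when a frame's memory traffic is split across
# two pools that transfer concurrently (DDR3 at ~68 GB/s, ESRAM at ~109 GB/s).
# The traffic splits below are hypothetical.

def effective_bandwidth(esram_fraction, esram_bw=109.0, ddr3_bw=68.0, total_gb=1.0):
    """Effective GB/s assuming both pools transfer fully in parallel."""
    esram_time = total_gb * esram_fraction / esram_bw
    ddr3_time = total_gb * (1.0 - esram_fraction) / ddr3_bw
    return total_gb / max(esram_time, ddr3_time)   # the busier pool sets the pace

for frac in (0.2, 0.62, 0.9):                      # share of traffic hitting ESRAM
    print(f"{frac:.0%} of traffic in ESRAM -> ~{effective_bandwidth(frac):.0f} GB/s effective")
# Only around a ~62% ESRAM share does the combined figure approach 68 + 109.
```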
 
The simultaneous read/write at 200GB/s is completely theoretical, whereas the 176GB/s in the PS4, just like the 109GB/s or the 68GB/s in the XB1, are reasonable expectations.

tl;dr
176 is the absolute maximum possible
68 is fine
109 is the guaranteed figure
200gb/s+ is not close.
Also you don't just add them up.


Fixed that for you. And your statement is incorrect: the 176GB/s is the absolute maximum, NOT the reasonable expectation.


Here's where that number comes from:


Taking the effective speed (5,500) x the bus width (256-bit) gives you the maximum, aka theoretical, bandwidth:

5,500 x 32 bytes (256-bit bus / 8 bits per byte) = 176,000MB/s = 176GB/s

That right there is an absolute maximum; it's not an average or the reasonably attainable figure.
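For anyone who wants to check that arithmetic, here is a minimal sketch of the same peak-bandwidth calculation (the clock and bus figures are the ones quoted above; the variable names are just for illustration):

```python
# Minimal sketch: peak (theoretical) GDDR5 bandwidth from the figures quoted above.
# Peak bandwidth assumes a transfer on every cycle, which never happens in practice.

effective_rate_mtps = 5500        # 1,375 MHz base clock, quad-pumped GDDR5
bus_width_bits = 256              # 256-bit memory interface

bytes_per_transfer = bus_width_bits // 8                       # 32 bytes per transfer
peak_gbps = effective_rate_mtps * 1e6 * bytes_per_transfer / 1e9

print(f"Peak bandwidth: {peak_gbps:.0f} GB/s")                 # -> 176 GB/s
```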
 
I know the PS4 is more powerful, but I do think the XB1 has some advantages in some areas. Overall, though, the PS4 is more powerful, and I don't think any regular Xbox fan on this site doubts this. It hasn't swayed my purchasing decision; I'm excited to see what Kinect can bring to the table.

And that's a perfectly reasonable response. People should feel free to choose and prefer whatever console they want, regardless of the specs, and be comfortable in that decision. People don't need to justify a decision as "better" on every level.
 
Or maybe they just want to correct some misinformation

Well, sure. I'd be happy to take that as an explanation on its own for Albert, but that would be in isolation. I'd have to ignore his misuse of the ESRAM discussion, and all the Simpson DNA evidence, which would just be stupid.
 

RoboPlato

I'd be in the dick
Penello is still adamant that the power disparity between the two consoles is not as huge as commonly believed. Based on the data we currently know, do you tech-savvy guys believe there is any feasible way that he could be correct, or are his arguments just damage control?

I think he's probably basing it on the performance of cross-gen games. There won't be much of a disparity in those between the XBO and PS4, since they're both so far ahead of the current-gen versions. We won't see what the power gap is really like until we get multiplatform, next-gen-only games.
 

sangreal

Member
He claims that it's not a simple 6% increase.

It's a 6% increase per CU!...as though that makes a difference???

Using the logic he used for the memory bandwidth, that means 6% increase x 12 CU's = 72% increase, putting it well beyond the PS4!

There is nothing inherently wrong with his argument; it's just being incorrectly applied. His argument is simply that adding cores is not always more beneficial than increasing the speed of the cores you actually use. That's true in general computing; the problem is that it really isn't true for graphics programming and the GCN architecture, which are heavily parallel. It's a pretty easy mistake to make, and not at all crazy -- just wrong.
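For a sense of scale, here is a back-of-the-envelope peak-FLOPS comparison using the widely reported GPU configurations (a sketch only; the CU counts and clocks are the publicly reported figures, and peak FLOPS is not the same thing as delivered game performance):

```python
# Sketch: peak single-precision FLOPS from the widely reported GPU configurations.
# GCN: 64 shader ALUs per CU, each capable of a fused multiply-add (2 ops) per clock.

def peak_tflops(cus, clock_mhz, alus_per_cu=64, ops_per_clock=2):
    return cus * alus_per_cu * ops_per_clock * clock_mhz * 1e6 / 1e12

xb1 = peak_tflops(cus=12, clock_mhz=853)   # reported Xbox One config (after the upclock)
ps4 = peak_tflops(cus=18, clock_mhz=800)   # reported PS4 config

print(f"Xbox One: {xb1:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS "
      f"({ps4 / xb1 - 1:.0%} higher peak)")   # -> 1.31 vs 1.84, ~41% higher
```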

Don't worry he's not a PR and is a MS engineer so I trust him.

He isn't PR or an engineer.
 

Vizzeh

Banned
And based on what I've seen, that's just backfiring spectacularly.

Because he's engaging in these discussions, it's so easy to see his participation on these forums as being designed to do just that - spin news in the hopes of changing the narrative. And that kind of image is not what MS needs to be painting themselves with right now, not when there are plenty of other things that Penello and co. could be doing around here instead, plenty of things that could be building genuine good will.

Luckily GAF is viewed by many, but think of the poor souls on other, less informed forums, just lapping up all this misinformation, all doe-eyed with silver spoons.

If anything, IMO this puts a bit of a nail in the coffin, as it shows the last resort was spin and that they are hiding no secret tech sauce; this would be the time to show it.
 

Caronte

Member
I haven't really been following this but why is MS making such a big deal about this now when they were proudly proclaiming back during the reveal that they purposefully did not target the high-end graphics?

Pre-orders and damage control, like everything they have done since the reveal.
 

Hollow

Member
Are you kidding? I mean really - are you kidding?

This is part and parcel of the territory here. You have to answer for your statements, especially if you're here in an official capacity. People get banned for being out of line, but poking holes in the arguments of other posters is well within the rules.

I've had my work here both praised and eviscerated, called out by numerous forum folks both publicly and via PM when I got stuff wrong, and I'm a goddamn admin. Guess what - I wouldn't have it any other way. That is what makes NeoGAF what it is.

There are many, many people who are more than capable of assessing, vetting and debunking technical claims and they have every right to do so. That's the price of doing business here. If we had official Nintendo or Sony reps on board, they would be subject to the same process.

If you're scared, buy a dog.

Thumbs up.

This is why I read GAF. I want brutal honesty.
 

Pain

Banned
Does Albert even know Sony's final specs and clock speeds? Seems to me he's just going by rumors, same as us. If you're going to compare, don't just list your strengths; do a fair comparison and mention both the positives and the negatives of both platforms. That's impossible unless MS knows the PS4's specs, and I doubt they do.

I look forward to seeing Albert in the Digital Foundry comparison threads. It may not be as one-sided as we once thought.
 

Klocker

Member
Look at my posts. Albert clearly stated that at 30GB/s the XB1 has 3x more coherent bandwidth than the PS4. This means Albert is stating the PS4 has a maximum of 10GB/s coherent bandwidth, which we know is factually incorrect.

I have also read the 10GB/s figure in one of the many threads here, so it's not the first time I have seen it.

People here disputed that, but we know it is not 30 on the PS4, so I think it is in that neighborhood, though understood to be less than 30. I think we may need to wait to know for sure.

Maybe someone can find the details of where the 10 actually came from.
 

benny_a

extra source of jiggaflops
I don't get it. Don't try to fool people into thinking the specs are the same.

Take those damn Unique Selling Points and market those. There is legitimately cool shit in the Xbox One.

Like my post from the dashboard leak thread: push the multi-tasking, like browsing, or watching YouTube or Twitch, all while the game continues to run in the background.
Depending on what other things you can do with the console, I think it could be quite useful. I'm mainly thinking of long loading times, or multiplayer lobbies where you're waiting for the next round.

Maybe even a fighting game lobby where it isn't your turn for a while.

It comes at a performance price because two things have to run simultaneously but it can be cool.
 

FINALBOSS

Banned
The question we should all be asking Albert Penello is: what the hell happened to cloud computing?

It's another bullet point among the many that Microsoft threw at the wall that didn't stick.

We've gone from "we didn't target the highest-end specs" (didn't stick), to cloud computing (didn't stick), to ESRAM (didn't stick), to upclocks (didn't stick), to "there's a negligible power difference" (didn't stick, as shown in this thread).
 
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.
 

Spongebob

Banned
There is nothing inherently wrong with his argument; it's just being incorrectly applied. His argument is simply that adding cores is not always more beneficial than increasing the speed of the cores you actually use. That's true in general computing; the problem is that it really isn't true for graphics programming and the GCN architecture, which are heavily parallel. It's a pretty easy mistake to make, and not at all crazy -- just wrong.



He isn't PR or an engineer.
Like the rest of his post.
 

Bsigg12

Member
This thread is going places! I'm not privy to the technical discussions, but I do enjoy reading people's reactions when Albert or someone else in a position to know talks about technical specs.
 
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.

:lol

Argument from Authority.

You're off to a great start. You're not posting any further evidence or reasoning to back up these claims from this "expert". Have him write it all up for us if you believe in its accuracy so much.

To say nothing of your amazing inside knowledge of Sony's specs.
 

Skeff

Member
Fixed that for you. And your statement is incorrect: the 176GB/s is the absolute maximum, NOT the reasonable expectation.


Here's where that number comes from:

The nominal speed of the RAM is 1,375MHz. GDDR5 makes it 5,500MHz effective.


Taking the effective speed (5,500) x the bus width (256-bit) gives you the maximum, aka theoretical, bandwidth:

5,500 x 32 bytes (256-bit bus / 8 bits per byte) = 176,000MB/s = 176GB/s

That right there is an absolute maximum; it's not an average by any stretch of the word.

No, you didn't. Did I say average? Or did I say reasonable expectation? Please learn to read.
 

Finalizer

Member
Luckily GAF is viewed by many, but think of the poor souls on other, less informed forums, just lapping up all this misinformation, all doe-eyed with silver spoons.

Really, the only ones I see lapping this information up are the ones who wanted to believe it in the first place. I don't see his contributions really doing much to change perspectives in the grand scheme of things. I see it only muddying MS' PR image even further.
 

Skeff

Member
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.

Albert, would you be able to clarify your previous statement regarding the coherent bandwidth of both machines, as it does not currently align with the known specifications of both boxes?
 

FINALBOSS

Banned
So, tech guys, we just have to take Albert's word for it since he got his info from one of the few technical geniuses (who are like a rare flower) at Microsoft?
 

Pain

Banned
Fixed that for you. And your statement is incorrect: the 176GB/s is the absolute maximum, NOT the reasonable expectation.


Here's where that number comes from:

The nominal speed of the RAM is 1,375MHz. GDDR5 makes it 5,500MHz effective.


Taking the effective speed (5,500) x the bus width (256-bit) gives you the maximum, aka theoretical, bandwidth:

5,500 x 32 bytes (256-bit bus / 8 bits per byte) = 176,000MB/s = 176GB/s

That right there is an absolute maximum; it's not an average or the reasonably attainable figure.
It's much easier to utilize GDDR5, though. That alone gives it an advantage even if the XB1 has the ability to sometimes surpass the PS4's bandwidth.
 
Are you kidding? I mean really - are you kidding?

This is part and parcel of the territory here. You have to answer for your statements, especially if you're here in an official capacity. People get banned for being out of line, but poking holes in the arguments of other posters is well within the rules.

I've had my work here both praised and eviscerated, called out by numerous forum folks both publicly and via PM when I got stuff wrong, and I'm a goddamn admin. Guess what - I wouldn't have it any other way. That is what makes NeoGAF what it is.

There are many, many people who are more than capable of assessing, vetting and debunking technical claims and they have every right to do so. That's the price of doing business here. If we had official Nintendo or Sony reps on board, they would be subject to the same process.

If you're scared, buy a dog.

This is fantastic. It really is.
 
D

Deleted member 80556

Unconfirmed Member
EDIT: Then, Albert, we want to see a more detailed write up, hopefully from him (although I understand that he might be too busy to write something for an internet forum), because just saying that you know a guy who knows his stuff isn't cutting it. It's the equivalent of any other user saying they have insider knowledge when they don't even prove it.

So please, prove it to us, Albert.

I deleted half of it. I think I need to do it again.

edit: Albert, I just don't understand how you have the details on Sony's hardware to make these statements. What am I missing?

I would really like to know this as well.

And we all know that the Guerrilla powerpoint presentation is outdated just by going with RAM allocation. So using the specs from there is using inaccurate information.
 
No you didn't. Did I say average? or did I say reasonable expectation? Please learn to read.


So the absolute maximum possible, counting every single cycle in the most ideal situation possible, based on that formula and having nothing to do with the real world... is what you call the "reasonable expectation"? Did I read that right? Serious question...
 

Troll

Banned
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.


Ok. I believe.
 

Raymo

Member
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.

Oh boy.
 

LiquidMetal14

hide your water-based mammals
These next couple of months are going to be irritatingly hard to get through if this kind of posting continues, especially with MSFT guys spouting whatever they want while Sony goes about their business.

If some write-up comes from the MSFT "fellows," you can bet it will be very pro-Xbone.
 

hawk2025

Member
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.


Fine.

Then can you explain this (it's back on page 6):

I don't understand how these two statements are logically consistent.


By the distributive property:

(X + X + ... + X) * 1.06 = 1.06*X + 1.06*X + ... + 1.06*X.

And that's assuming the CUs work linearly.

Unless I'm misunderstanding you, you said the exact opposite -- having additional CUs does not result in a linear increase in power.

Therefore, your statement is actually THE OPPOSITE of what you are pointing out:

(X + X + ... + X) * 1.06 > 1.06*X + 1.06*X + ... + 1.06*X.

But then, by your very own logic, having each CU running 6% faster is actually **worse** than a 6% increase overall.


This is pure, simple **algebra**, with absolutely no need for technical knowledge whatsoever, and it points out two directly conflicting claims on your list. What am I missing?
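For completeness, the distributive-property point is easy to check numerically (a trivial sketch; it only assumes the CUs' contributions simply add up, i.e. linear scaling):

```python
# Sketch: a 6% clock bump applied per CU is identical to a 6% bump on the total,
# provided the CU contributions simply add up (linear scaling).

per_cu = 100.0                                      # arbitrary units of work per CU
cus = 12

bump_whole = (per_cu * cus) * 1.06                  # 6% increase applied to the sum
bump_each = sum(per_cu * 1.06 for _ in range(cus))  # 6% increase applied per CU

assert abs(bump_whole - bump_each) < 1e-9
print(bump_whole, bump_each)                        # both 1272.0
```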
 
At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.


Can you ask that person to do an AMA on here?
 
Are you kidding? I mean really - are you kidding?

This is part and parcel of the territory here. You have to answer for your statements, especially if you're here in an official capacity. People get banned for being out of line, but poking holes in the arguments of other posters is well within the rules.

I've had my work here both praised and eviscerated, called out by numerous forum folks both publicly and via PM when I got stuff wrong, and I'm a goddamn admin. Guess what - I wouldn't have it any other way. That is what makes NeoGAF what it is.

There are many, many people who are more than capable of assessing, vetting and debunking technical claims and they have every right to do so. That's the price of doing business here. If we had official Nintendo or Sony reps on board, they would be subject to the same process.

If you're scared, buy a dog.

Love this.

timlot
Banned

Finally get to use my new gif.

 