
Ars Technica: Penello's XB1 Numbers, Ars "Sniff Test" Analysis

sangreal

Member
Penello: "18 CUs [compute units] vs. 12 CUs =/= 50% more performance. Multi-core processors have inherent inefficiency with more CUs, so it's simply incorrect to say 50% more GPU."

Ars: "The entire point of GPU workloads is that they scale basically perfectly, so 50% more cores is in fact 50% faster."

That's what I always thought, but ERP (former MS, former SCEA) shot that down on B3D:
FWIW my expectation is that PS4 ought to have a performance advantage, but I wouldn't expect it to reflect the difference in CU counts. CU's are MASSIVELY underutilized on vertex heavy workloads and plenty of the frame will be ROP or bandwidth limited.
There are just too many little things that can significantly impact performance, there were times with early firmwares on PS4 where seemingly innocuous changes would affect performance by as much as 10%.

The eSRAM will certainly provide an advantage under some circumstances, and I'm interested if the ROP difference will end up being a factor, or the lower eSRAM latency will end up nullifying it.

For that matter I could imagine the graphics API having a significant effect; the CPU single-threaded performance isn't great on these machines, and a poorly conceived implementation could hurt games across the board. Sony having "lower level access" here isn't necessarily a win.

Hell I could imagine cases where CPU limited games demonstrate an advantage on XB1.

Penello: "Adding to that, each of our CUs is running 6% faster. It's not simply a 6% clock speed increase overall."

Ars: "What the hell does that even mean?"

My understanding is that he is making the same point as before. A 6% faster clock per core obviously gives you 6% overall, but only if each core is at 100% utilization, and he had just finished making that point. So, tied to his previous point, I believe he is trying to demonstrate that each core benefits even without 100% utilization, unlike simply adding more cores.
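For what it's worth, the raw arithmetic behind the "50% more CUs" and "6% faster" claims can be sketched quickly. The CU counts and clocks below are the figures quoted in this thread; the 64 SIMD lanes per CU and 2 ops per cycle (FMA) are standard GCN assumptions, not anything Penello or Ars stated:

```python
# Peak-throughput math for the "50% more CUs" vs. "6% faster clock" debate.
# CU counts and clocks are the thread's figures; 64 lanes/CU and
# 2 ops/cycle (fused multiply-add) are standard GCN assumptions.
def peak_gflops(cus, mhz, lanes_per_cu=64, ops_per_cycle=2):
    """Theoretical peak = CUs x lanes x ops/cycle x clock (MHz)."""
    return cus * lanes_per_cu * ops_per_cycle * mhz / 1000.0

xb1 = peak_gflops(cus=12, mhz=853)  # ~1310 GFLOPS
ps4 = peak_gflops(cus=18, mhz=800)  # ~1843 GFLOPS

print(f"clock uplift:  {853 / 800 - 1:.1%}")  # 6.6%
print(f"PS4 peak edge: {ps4 / xb1 - 1:.1%}")  # ~40.7% on paper, not 50%
```

On paper the clock bump shrinks the gap from 50% to roughly 41%; whether real workloads ever track these peak numbers is exactly what ERP's quote above disputes.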

Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."

Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."

There is nothing wrong with adding the ESRAM and DDR3 bandwidth, because the GPU can read from both at the same time. Yes, it is a poor comparison because one is 32mb and one is 8gb, but that is a different argument - one that is dependent on workload, which the author admits to not knowing how it will be used. This isn't the same as adding the CPU bandwidth or something to get total system bandwidth like MS did for 360 (and were incorrectly accused of doing for xb1). That is idiotic and meaningless.
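To make the point concrete, the 272 figure really is just the two pools' peaks added together (numbers as quoted in the thread; whether the sum is meaningful is the contested part, since the fast pool is only 32 MB):

```python
# XB1: two separate pools the GPU can hit concurrently.
ddr3_peak = 68    # GB/s, 8 GB main memory
esram_peak = 204  # GB/s, 32 MB scratchpad (simultaneous read/write)
xb1_total = ddr3_peak + esram_peak  # 272 GB/s -- Penello's number

# PS4: one unified pool.
gddr5_peak = 176  # GB/s, 8 GB GDDR5

print(xb1_total)  # 272
```

The catch, as the post says, is workload-dependent: the 204 GB/s side only applies to whatever fits in that 32 MB window.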
 

Toki767

Member
That's the WiFi frequency...

I thought that was the GDDR5 and has nothing to do with the CPU?

I thought there were killzone shadowfall slides released that indicated it was 1.6.

Man so many numbers being thrown around the past few months it's been hard to keep track.

sony-ps4-dev-kit-specs.jpg

Was that not talking about the CPU?
 

Krakn3Dfx

Member
I thought there were killzone shadowfall slides released that indicated it was 1.6.

It was 1.6 then; that was a while ago and a very early dev kit.

MS' change to 1.75 has been very recent, pretty sure even Penello said that the devkits at Gamescom and probably even PAX were still running at 1.6.
 
I appreciate him taking the time to post and interact with the community, but he's doing more harm than good, with everything he says making headlines daily (and ALL of it negative in nature). If Microsoft was smart they'd shut him down before he does even more harm.

Most industry folks get PR training before they're allowed to do interviews or post, and his recent posts are prime examples of engagements that should be avoided at all costs.
 
I agree with this, but I also understand that this specs war thing is beyond beating a dead horse at this point. We're never going to see MS come out and openly say their specs are weaker in comparison to Sony's, nor will they openly admit that they misled the public (neogaf) or lied.

It is beating a dead horse; however, Microsoft keeps restarting this conversation (be it via PR, tweets, posts here, and whatnot) and we keep going around in circles.
 

Vizzeh

Banned
I appreciate him taking the time to post and interact with the community, but he's doing more harm than good, with everything he says making headlines daily (and ALL of it negative in nature). If Microsoft was smart they'd shut him down before he does even more harm.

Most industry folks get PR training before they're allowed to do interviews or post, and his recent posts are prime examples of engagements that should be avoided at all costs.

The thing is, PR is what I hate - the completely uninformative manner behind it. Albert is a nice guy, you can tell; he is well mannered and polite, especially under massive scrutiny. What we do know, though, is that the way he worded his facts seemed twisted - that was his negative. He didn't explain how he got his numbers, which seemed quite dubious to most. It may have even been an attempt to put a numbers spin on the headlines, hoping many would soak it up; luckily most guys at GAF are not yes-men :)
 

iNvid02

Member
albert: yeah hey guys, looks like we had some communication problems once again. those god damn communication wires get muddled all the damn time... i know right
 

RedAssedApe

Banned
should have just given the standard "we do not comment on rumors or speculation" PR response lol

that or just lurked GAF for info gathering purposes and not posted at all.
 

p3tran

Banned
I feel bad for Albert. I shouldn't, but I do.
Don't. It was his choice.

And anyway, the judge of his credibility will not be someone who answers (reasonably) "I don't know about that yet".

E-V-E-R-Y-T-H-I-N-G will be calculated and analyzed way before this generation closes its first year.

They (both) just have to bring out their machines. Both delayed, both underpowered for the next five years.
 

Freki

Member
I feel bad for Albert. I shouldn't, but I do.

Why? What he posted wasn't a mistake - it was an attempt to spread FUD. He did so knowingly and willingly.
If you google for FUD and Microsoft you'll see that this is one of their oldest tricks in the PR playbook...
 
How reliable is Ars when it comes to understanding this kind of stuff though?
Extremely. Technology is their original expertise and specialty.

That this type of response is coming from Peter Bright (Ars's own MS reporter, who is a huge MS fanboy by his own admission and uses a Windows Phone) and Kyle Orland (their games editor) says a lot.

Ars's readership also heavily prefers MS and hates Sony, so there's that, too.
 

p3tran

Banned
Extremely. Technology is their original expertise and specialty.

That this type of response is coming from Peter Bright (Ars's own MS reporter, who is a huge MS fanboy by his own admission and uses a Windows Phone) and Kyle Orland (their games editor) says a lot.

Ars's readership also heavily prefers MS and hates Sony, so there's that, too.

pfff doesn't hold a candle to OUR Kyle :D
 

Ishan

Junior Member
Yeah, Albert was doing a good job till the hardware stuff he started posting... eh, comes with the job I guess. I do wonder how many people get fooled by this. I know when I was younger I just believed all the PR stuff, didn't know what it all meant, and went like: ooh, so many teraflops, blah blah, awesome, need new console.
 

Garryk

Member
let's get everyone from MS here.

I'm just imagining a room full of MS employees with a Neogaf reply on the big screen, carefully wordsmithing a post.

I can imagine this because I've been in meetings where someone is composing an e-mail on the projector screen getting group feedback.
 
So what the hell is going to happen with multiplats then?

Albert's continual push that differences will look negligible almost makes me think the tinfoil hat theory of MS buying multiplat parity is not so far-fetched.
 
Man so many numbers being thrown around the past few months it's been hard to keep track.

http://www.blogcdn.com/www.engadget.com/media/2013/07/sony-ps4-dev-kit-specs.jpg

Was that not talking about the CPU?

No. It was the max frequency for any chip, transmitter, or receiver within the PS4 devkit, including WLAN, Bluetooth, and whatever else.

So what the hell is going to happen with multiplats then?

Albert's continual push that differences will look negligible almost makes me think the tinfoil hat theory of MS buying multiplat parity is not so far-fetched.

MS might be able to negotiate that devs keep the same quality of assets (models and textures), but it would take some real work to make the performance any worse on PS4 than on XBOne. They might be able to lock the Xbox One and PS4 versions to the same framerate, but whereas one version (XBO) would see dips in performance, the other (PS4) wouldn't.
 

Gestault

Member
As someone who derives a big part of his enjoyment of the gaming hobby from the tech-side of things, it drives me crazy when a company tries to market less powerful hardware as being more powerful. I'm glad they put together this piece.
 

p3tran

Banned
So what the hell is going to happen with multiplats then?

Albert's continual push that differences will look negligible almost makes me think the tinfoil hat theory of MS buying multiplat parity is not so far-fetched.
nfs does not look worse than driveclub
 

Bundy

Banned
Full of shit as expected
Exactly!
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."

Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."
Wait!
Penello really said that?
Bwhahahaha!
 
I know, I was surprised they didn't assert that the cloud would make up for the difference. I thought for sure they would. :'/

That was during the time they still had all their pre-180 policies and other main focuses in place; they didn't really care about superior power and even said they didn't target the highest graphics on purpose. Now, after the 180, the console is basically a weaker and $100 more expensive alternative to the PS4, and with all the cloud power debunked as marketing speak, it's understandable they can't rest, knowing that people are aware MS' console is weaker.
 

coolasj19

Why are you reading my tag instead of the title of my post?
Currently it is 1.6 until Sony says otherwise, just like it was with Microsoft until they confirmed 1.75.

Hypothetically, if Sony did up it to the same speed, what would be the best spin? I like to think "We didn't announce anything before, which means those were all rumours. This is official" would fly the best.
 
The sooner AMD and MS release the full rundown on the final tech, the better. If what was said is true, we have to endure just a few more weeks of very predictable tech speculation threads and tech-site hit-generators. They need to give direct interviews to all of these tech sites when they do; otherwise we're gonna have more thread-derailing, conversation-stifling accusations of bias or unreliability, if not straight bullshit.
 
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."

Haha, it's up to 272gb/sec now?
MS really should rein in their PR dudes; they're starting to look really ridiculous.
 
This thread is going to be great! Hopefully the mods do not close this one, because this thread is pretty important to truly show what type of confusion Microsoft's PR is causing and just how their numbers do not add up at all.

This has to be really embarrassing when Albert says these numbers came straight from a Technical Fellow, and he talked up the Technical Fellow as being so intelligent and smart. Plus, there were even people in that thread saying they would rather believe Albert and the Technical Fellow instead of GAF users; however, the Ars statements are nearly exactly what the GAF users said.

It's also very interesting to go back and read Albert's defense of these numbers, standing behind the Technical Fellow's specs/numbers/statements, even though it is now pretty evident that the Technical Fellow had no idea what he was talking about.

We all know Albert is out "rounding up answers" again, so I am interested in what he has to say.
 
Penello: "18 CUs [compute units] vs. 12 CUs =/= 50% more performance. Multi-core processors have inherent inefficiency with more CUs, so it's simply incorrect to say 50% more GPU."

Ars: "The entire point of GPU workloads is that they scale basically perfectly, so 50% more cores is in fact 50% faster."

I highly doubt this is true. If so, AMD/Nvidia would just keep adding CUs to cards instead of increasing bandwidth, ROPs, etc. Why consider bandwidth at all if increasing CUs gives perfect scaling?
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."

Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."

The ESRAM and DDR3 work simultaneously, so you CAN add the bandwidths. It is not inaccurate. A meaningless stat, though.

Penello: "We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles."

Ars: "Maybe true."

As far as we know it is 100% true.

Penello: "Speaking of GPGPU—we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU."

Ars: "I don't know if that's even true."

Why not try to find out?
 

derFeef

Member
The sooner AMD and MS release the full rundown on the final tech, the better. If what was said is true, we have to endure just a few more weeks of very predictable tech speculation threads and tech site hit-generators. They need to directly interview with all of these tech sites when they do, otherwise we're gonna have more thread-derailing, conversation-stifling accusations of bias or unreliability if not straight bullshit.

That already happened at Hotchips, we have slides from AMD/MS and everything I thought?
 

Proelite

Member
Penello is a marketing person, not a "fanboy". He is trying to sell us a product by any means necessary, that includes misrepresentation of facts and bizarre misdirection (like his "we invented Direct Compute/DirectX" comment).

It's also why he is trying to "buddy up" on the forums.

Making posts in a thread on Neogaf is a huge waste of effort if he is simply marketing. In addition, he posts during off-work hours.

I don't think there was a spec sheet written by his team for "conquering" NeoGAF, nor do I think he is compelled to post anything at all.
 

Josh7289

Member
I highly doubt this is true. If so AMD/Nvidia would just keep adding CU's to cards, instead of increasing bw, ROPS etc. Why consider bandwidth at all if increasing CU's gives perfect scaling?

Because it costs more die space and heat to keep adding CUs? (I don't really know, but I could see that being true.)
 

Sean*O

Member
OK Microsoft, time to send out your next silver tongued PR guy because this one didn't get much farther than the ones before him.
 

vpance

Member
I imagine Penello spoke to his technical fellow for the first time the other day.

Dude must've been scared for his job and just told him how powerful it was and hoped he'd go away, lol.
 

Kysen

Member
I really wish we could get some Sony leaks on here, we have heard nothing from behind the scenes since February.
 
nfs does not look worse than driveclub

I'm not really sure how that's relevant?

Far more relevant will be how the same game looks on different systems as they will be developed by the same people (for the most part) with the same assets, engines etc.
 
Exactly!

Wait!
Penello really said that?
Bwhahahaha!

What's even better is one post in that old thread (http://www.neogaf.com/forum/showpost.php?p=80986853&postcount=949):

And please allow me to help out your PR department a little. 204gb/sec, according to your understanding of the number, actually implies the old clock rate of 800mhz. The new number should be 853 * 128 * 2 (simultaneous read/write per cycle) = 218gb/sec. They can thank me later.
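That quoted correction checks out; redoing it, with the 128 bytes per cycle and the simultaneous read/write factor carried over as assumptions from MS's own 204 figure:

```python
# eSRAM peak bandwidth = clock x 128 bytes/cycle x 2 (read + write).
# 128 bytes/cycle and the x2 read/write factor are the assumptions
# implied by MS's own 204 GB/s figure at the old 800 MHz clock.
bytes_per_cycle = 128
rw_ports = 2

old_peak = 800e6 * bytes_per_cycle * rw_ports / 1e9  # 204.8 GB/s -> "204"
new_peak = 853e6 * bytes_per_cycle * rw_ports / 1e9  # ~218.4 GB/s

print(round(old_peak, 1), round(new_peak, 1))  # 204.8 218.4
```

In other words, quoting 204 GB/s alongside the new 853 MHz clock mixes figures from two different clock rates, which is the poster's point.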
 