
Ars Technica: Penello's XB1 Numbers, Ars "Sniff Test" Analysis

Freki

Member
Wifi on the 2.4GHz band runs from 2.412 to 2.472GHz (channel center frequencies). The clock frequency listed in the FCC filing for the dev kit refers to something else (probably not the APU, as Jaguar cores only go up to 2GHz).

It's the maximum possible frequency of that particular device - I suppose the Wifi chip can go a little bit higher than the freq defined in the standard...
edit: or this
didn't think of it - lol
And I guarantee everyone that this freq has nothing to do with the CPU or GPU - no secret sauce - sry :-D...
 
Where's my penny arcade comic?

Those guys would never be that harsh to Microsoft. They might not get invited to the next Microsoft event, or have dinners out with high-ranking Xbox guys so often, if they did.

I remember, right after the XB1 reveal event, Brian Crecente mentioning he ran into the Penny Arcade guys "and they loved it".
 

GameSeeker

Member
Albert's track record on the "technical facts" is very poor. He should stick to things he knows, like the name of the product, its price, and ship date. All important marketing stuff.
 
[image: bEJNL1n.gif]

This is Microsoft in regards to the Xbone.

Cute!

But what Microsoft is doing is in no way, shape, or form cute.
 

Atilac

Member
The ESRAM and DDR3 work simultaneously, so you CAN add the bandwidths. It is not inaccurate. Meaningless stat though.
No, you cannot add them; if you could, you might as well add the bandwidth of each separate memory module in the PS4, since each module is being accessed at the same time.
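
A rough back-of-envelope, assuming the peak figures Penello himself has cited (68 GB/s DDR3, 204 GB/s ESRAM, 176 GB/s GDDR5 - treat these as the commonly reported numbers, not gospel), shows why summing the pools is misleading:

```python
# Hypothetical sanity check using the commonly cited peak figures.
# Adding pool bandwidths assumes every byte you need happens to live
# in the right pool at the right moment.

DDR3_GBPS = 68.3      # XB1 main memory: 256-bit @ 2133 MT/s
ESRAM_GBPS = 204.0    # XB1 ESRAM peak (simultaneous read/write case)
ESRAM_SIZE_MB = 32    # the fast pool is tiny
GDDR5_GBPS = 176.0    # PS4 unified memory: 256-bit @ 5500 MT/s

naive_sum = DDR3_GBPS + ESRAM_GBPS
print(f"Naive XB1 'total': {naive_sum:.1f} GB/s vs PS4 {GDDR5_GBPS:.1f} GB/s")

# Any working set bigger than the 32 MB ESRAM spills to DDR3, so the
# sustained rate trends toward the DDR3 figure, not toward the sum.
```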
 

Duxxy3

Member
Albert's track record on the "technical facts" is very poor. He should stick to things he knows, like the name of the product, its price, and ship date. All important marketing stuff.

Or just as important... why the hell is it $100 more?!?
 

Metfanant

Member
What I also find funny about MS and their stance on what makes the console better (more efficient) is that they are now saying things like "CUs are inherently inefficient" and how their engineers are so smart...

Are they honestly trying to convince everyone that Sony doesn't have smart engineers? And that Sony somehow wouldn't have access to the same knowledge as them? Cerny has talked at length about utilizing the "holes" in computational cycles to perform other tasks...it's OBVIOUS to anyone with a brain that while certain things MS (Albert) is saying are true, Sony has already thought of this as well and has implemented things to counteract it...

Cerny even said they researched using an architecture very similar to the Xbone's (with even higher bandwidth numbers) but decided against it...
 

besada

Banned
You guys are just having trouble imagining how the less powerful console could, in actuality, be the more powerful console.

Which is weird, since we've seen that play out repeatedly this generation, as the 360 beat the Cell-powered PS3 in most multiplats.
 
No, you cannot add them; if you could, you might as well add the bandwidth of each separate memory module in the PS4, since each module is being accessed at the same time.


Add internet bandwidth and cloud computing and you practically get a system that scales in power as internet speeds increase!
 
Those guys would never be that harsh to Microsoft. They might not get invited to the next Microsoft event, or have dinners out with high-ranking Xbox guys so often, if they did.

I remember, right after the XB1 reveal event, Brian Crecente mentioning he ran into the Penny Arcade guys "and they loved it".

They don't have the subtlety or nuance to pull it off.
 
Granted, it was in 1989, but he has a degree in communications. :) Major Nelson has been blogging, tweeting and working the social media angle for Xbox for a while now. I wouldn't be surprised if he never posts in any of these threads. He does seem to post in /r/xboxone though. But probably because they worship him there lol

Color me shocked.

Well, my point still stands. Social media is a totally different beast than the traditional "communications" of old. It's far too easy to be called out on bullshit in this day and age.

Even Sony's "Emotion Engine" and Sega's "blast processing" wouldn't fly today.

Apple barely gets away with "Retina display"
 

FINALBOSS

Banned
I think this one:


is the only one that'll give them an advantage, mainly the audio chip. I remember reading that audio takes a lot of CPU cycles/power. The PS4 is still more powerful, though. As a casual observer, the 50% fewer CUs part just seems like a dagger, maybe not through the heart but through a kidney or something.

Keep in mind that their audio chip is only that beefy because of Kinect.
 

Metfanant

Member
Which is weird, since we've seen that play out repeatedly this generation, as the 360 beat the Cell-powered PS3 in most multiplats.
Because we all (most?) understand that those differences are due to architectural design...not necessarily computational power...
 
So why is this Ars article taken as fact and not ERP, who is a known dev?

Because this ERP has been constantly downplaying the PS4 since its announcement? I haven't seen one positive post of his posted here yet.

That said, I don't read Beyond3D past what is posted here.

And what Sony devs are based in Redmond anyway?
 

johnny956

Member
So why is this Ars article taken as fact and not ERP, who is a known dev?

CUs are MASSIVELY underutilized on vertex-heavy workloads, and plenty of the frame will be ROP or bandwidth limited

Doesn't the PS4 have ROP and bandwidth advantages too? So either way he's saying it's going to be an advantage for the PS4; just don't go by the CUs alone.
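
For a rough sense of the ROP side, a hedged back-of-envelope using the commonly reported specs (32 ROPs @ 800MHz for the PS4, 16 ROPs @ 853MHz for the XB1 - reported figures, not official spec sheets):

```python
# Peak pixel fill rate = ROP count * core clock.

def fill_rate_gpix(rops: int, clock_mhz: int) -> float:
    """Peak pixels written per second, in gigapixels."""
    return rops * clock_mhz / 1000.0

ps4 = fill_rate_gpix(rops=32, clock_mhz=800)   # ~25.6 Gpix/s
xb1 = fill_rate_gpix(rops=16, clock_mhz=853)   # ~13.6 Gpix/s

print(f"PS4 {ps4:.1f} Gpix/s vs XB1 {xb1:.1f} Gpix/s ({ps4 / xb1:.2f}x)")
```

If a frame really is ROP-limited, that gap matters regardless of how the CU debate shakes out.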
 
Sony is like a prude; they know how to keep their lips shut.
Which is actually kind of weird, considering the leaks we got from them during the PS3 era.

It almost makes you think that all those "leaks" were deliberate red herrings to keep people from finding out about the PS4 early.
 

gruenel

Member
What!!!!

Is adding more CU cores to the GPU a bad thing now? Is that the new MS spin?

Dear God, didn't see this coming!

That actually came up a few weeks ago.

I remember there was a discussion about how 18 CUs are "unbalanced" and shit like that.
 
That already happened at Hot Chips; we have slides from AMD/MS and everything, I thought?

There seems to be a lot of missing info and, conspicuously, no information on the major MS-designed silicon inside the system which governs the operation of the GPU and the ESRAM. They did not really explain the intention behind the memory architecture, its focus on so much local memory, or how it is supposed to work with all of the other unexplained but leaked stuff inside. You know, all of those fifteen specific co-processors and fixed-function chips that MS threw into the design.

Personally, I think MS went with a Frankenstein to plug all possible holes in the boat. I'm guessing that they profiled and analyzed all of the parts of the pipeline where fixed-function chips could more effectively bear the brunt of so many tasks normally given to the general purpose CPU and perhaps some of the more demanding GPU processes. You know, claw back a lot of programmable resource (from what seems like tablet-class hardware) by using more cheap, specialist logic to give the bigger chips more headroom. A kitchen full of cooking staff is better than just a couple of chefs doing it all on their own, or so I imagine their vision is.

The memory config seems bent on keeping system-wide bandwidth higher between parts, increasing utilization with less chance for performance-killing contention for memory access, and not just a bigger straw from external memory. At least, that's what everything revealed so far seems to paint.

Given the mystery surrounding the custom MS chips inside, the special sauce could be MS essentially having built their own stab at stepping up with regard to the GCN capability roadmap, giving the X1 a more fully-featured ability to have hardware help maintain coherence, and offering something next on their own DirectX roadmap. I guess it could make total sense if AMD had some of their next-gen APU IP in that chip, stuff that would have been part of the next GCN level. It would fit the history of Xenos and the X360 and how MS got dibs on a relatively early look at the next wave of tech.

Is this all just a happy dream? Perhaps, but it's more fun to speculate than come down hard on the same disjointed pieces of the puzzle when talking about what the real thing has and does and why. I'm sure the truth is far more mundane and simple.
 

RoboPlato

I'd be in the dick
Penello's Technical Fellow fed him bullshit and hoped we wouldn't figure it out. Surprised Ars ripped it apart that bluntly.
 
This guy always sounds so abrasive; it's like he's a fucking valley girl or something.

"Uh, yah, we totally put out GPUPU with the Kinect. It's like so 2010."
 

Skeff

Member
Aren't a few of them locked away for GPGPU use though?

No.

Quick note for everyone here saying CUs aren't the only thing in GPUs that matters because you need more ROPs etc.: the CU count is the smallest advantage, percentage-wise, that the PS4 has over the Xbone's GPU, AFAIK. For example, the ROPs are a 100% increase, not a 50% increase.
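
A quick hedged tally using the commonly reported numbers (18 vs 12 CUs, 32 vs 16 ROPs, 176 vs ~68 GB/s main memory - leaked/reported figures, not official spec sheets) shows where the percentage gaps sit:

```python
# PS4's advantage over the XB1, percentage-wise, per reported specs.

specs = {
    "CUs":           (18, 12),
    "ROPs":          (32, 16),
    "Main RAM GB/s": (176.0, 68.3),
}

for name, (ps4, xb1) in specs.items():
    advantage = (ps4 / xb1 - 1) * 100
    print(f"{name:>14}: PS4 +{advantage:.0f}%")

# CUs: +50%, ROPs: +100%, main RAM bandwidth: +158% --
# the much-discussed CU gap is actually the *smallest* of the three.
```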
 
Yet, as bad as they have been at PR, it will be a total shame that a half-billion-dollar ad campaign will wash it all away from memory.

Surface and Windows Phone 8 show that even extensive marketing can't save an undesirable product.

Luckily the Xbox One is still a desirable one, even in its original form.

More desirable than the PS4? Time will tell.
 

Atilac

Member
Quick internet search: the PS4 has 16 GDDR5 memory modules; each module has 48 GB/s of bandwidth, for a combined total of 768 GB/s. Albert's math is pretty awesome when you use it both ways.
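
For contrast, a hedged sanity check with the actual shared-bus math (assuming the commonly reported PS4 config: 16 chips sharing a 256-bit bus at 5500 MT/s effective):

```python
# Why per-module addition is Penello's trick, just inverted: the 16
# GDDR5 chips share one 256-bit bus, so their bandwidths aren't
# independent lanes you can stack.

BUS_WIDTH_BITS = 256
DATA_RATE_GT_S = 5.5    # effective transfers per second, in G
CHIP_COUNT = 16

total_gbps = BUS_WIDTH_BITS / 8 * DATA_RATE_GT_S   # 176 GB/s, the real figure
per_chip = total_gbps / CHIP_COUNT                 # ~11 GB/s per 16-bit chip

print(f"Actual total: {total_gbps:.0f} GB/s ({per_chip:.0f} GB/s per chip)")

# Summing inflated per-chip peaks gets you 768 GB/s on paper and
# nowhere in practice -- which is exactly the point of the joke.
```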
 

Metfanant

Member
Aren't a few of them locked away for GPGPU use though?
No, from what I gather (and someone correct me if I'm wrong) a few of them (4?) have a little secret sauce (lol) that allows for some additional compute voodoo...this is where the whole 14+4 unbalanced nonsense came from...

Btw: it's fun throwing things like secret sauce and voodoo after technical terms...makes them seem cooler...no wonder MS does it
 

velociraptor

Junior Member
Surface and Windows Phone 8 show that even extensive marketing can't save an undesirable product.

Luckily the Xbox One is still a desirable one, even in its original form.

More desirable than the PS4? Time will tell.
The only thing Sony will need to advertise is the fact that it is $100 cheaper. While the Xbox has a big fanbase, I suspect its high price will deter many.
 
So wtf did he "develop" on the PS4!?

Who knows? You can be a PS4 developer and take another job not in PS4 development and still be a former PS4 developer, though.

Just because he left the company before the console's release does not mean he should lose credit for any of the work he actually did during the development process of whatever project he was attached to.
 

Metfanant

Member
He used to develop a game on PS4? What's so hard to understand about it?
And he left mid-development? Real team player...sounds like a genuine dude

Who knows? You can be a PS4 developer and take another job not in PS4 development and still be a former PS4 developer, though.

Just because he left the company before the console's release does not mean he should lose credit for any of the work he actually did during the development process of whatever project he was attached to.
No...but it raises questions about his motives...that's all...
 

artist

Banned
So wtf did he initially "develop" on the PS4!?...considering, ya know, just about all of it (on the software side) is still being developed?
What ERP said doesn't support Penello's argument at all...it's just that some people are trying to use it as leverage.
 