
Albert Penello puts dGPU Xbox One rumor to rest

Status
Not open for further replies.

rjinaz

Member
Instead of twisting numbers, you should just show your first party games a bit more (FM5 in particular has had very little media shown - only two tracks two months before the release?) and some popular third party games running on your box (FIFA, COD, BF4 etc.) to instill people's confidence that you're going to provide a capable machine, regardless of what Sony are doing.

I agree. They should really just give up on the graphics argument at this point; it's a losing battle. Instead they need to try and convince me and others why, despite having lesser graphics, theirs is the box for me and not the graphically superior PlayStation.

I think I would try something like this:

We concede that the PS4 does have slightly better specs. But Xbox One offers entertainment in ways that go beyond the graphical capabilities of a machine. Like Kinect (explain), like the TV connectivity (explain), like our first party titles (explain).

We at Microsoft believe that graphics can only take you so far in video game entertainment and that there are better ways to improve user experiences.


Then back it up with things like Oculus Rift Support and Forza glasses...

Edit: Err well not Forza ha, whatever those glasses are called.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
So it's still 20GB/s for PS4 vs 30GB/s Xbone? Or am I misunderstanding something?

According to the available figures, that is correct. The slide from vgleaks seems to be outdated, invalid, or misunderstood, since quotes from Cerny give the Onion bus 20GB/s, not 10GB/s. Maybe it's 10GB/s in each direction simultaneously, which would sum to 20GB/s.

This bandwidth is, in both systems, cache-coherent in the sense that the GPU always "sees" data that the CPU sees without the need for explicit synchronization by flushing the caches. And both systems seem to behave the same way in this respect. For the CPU to "see" data that the GPU sees, both systems have to flush caches. The PS4 can selectively flush individual cache lines, while the XB1 apparently needs to flush the entire GPU cache. That is the evidence so far.
 

shandy706

Member
We're all "Technical Fellow" material here.

You would think so. It seems half of us are apparently working on PS4 and X1 hardware, to the point that we know how every game will perform in the real world once released.

I'm not even going to pretend I know anything beyond what's on paper. The PS4 has better specs based on what I understand. Will it result in negligible framerate/resolution/graphics differences, or will it be the night and day some seem to think?

I'll just wait and see once the consoles come out. I won't care really (PC for cross-platform), but it will be fun to watch some eat crow.

If it's 4-5fps and 2xAA vs 4xAA, I'll laugh at those that said "huge gap".

If it's 30fps and 2xAA vs 8xAA (or some better form of AA) and/or crazy post-processing, I'll laugh at the "they're close" people.
 

onQ123

Member
I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they popped up, so I apologize for the delayed response.

I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.

So I thought I would add more detail to what I said the other day, that perhaps people can debate those individual merits instead of making personal attacks. This should hopefully dismiss the notion I'm simply creating FUD or spin.

I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch, or what they have said. But I do need to draw comparisons, since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront that I have nothing but respect for those guys, but I'm not a fan of the misinformation about our performance.

So, here are a couple of points about some of the individual parts for people to consider:

• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.

Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.
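The paper arithmetic behind those bullets is easy to check. A minimal sketch, using only the figures quoted above; how much of this survives real workloads is exactly the disputed question, so only peak numbers appear here:

```python
# Peak-figure arithmetic from the bullet points above (all numbers as quoted).
ps4_cus, xb1_cus = 18, 12
xb1_clock_bonus = 1.06  # the "6% faster" CU clock claimed in the post

# Naive CU ratio: 18/12 = 1.5, i.e. the "50% more GPU" figure being disputed
naive_ratio = ps4_cus / xb1_cus

# Folding in the 6% clock advantage narrows the paper gap only slightly
adjusted_ratio = ps4_cus / (xb1_cus * xb1_clock_bonus)

# Bandwidth peaks as quoted: DDR3 + ESRAM vs GDDR5 (on paper, not sustained)
xb1_bw = 68 + 204  # GB/s
ps4_bw = 176       # GB/s

print(f"naive CU ratio: {naive_ratio:.2f}")           # 1.50
print(f"clock-adjusted ratio: {adjusted_ratio:.2f}")  # 1.42
print(f"XB1 combined peak {xb1_bw} GB/s vs PS4 {ps4_bw} GB/s")
```

Even granting the clock bump, the paper ratio only drops from 1.50 to roughly 1.42; the thread's argument is over how much of that gap shows up in shipped games.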

I still believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around – they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.

Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible than I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.

Thanks again for letting me participate. Hope this gives people more background on my claims.

How do you know this? You work at MS, and Sony hasn't even released the clock speed of their CPU, or the specs of their audio chip. And what about their secondary custom chip?




How do you know that this chip isn't taking a big load off of the PS4 CPU/GPU when it comes to stuff that the Xbox One will need to use CPU/GPU resources for, like the OS, second screen, video and voice chat, gameplay recording, gameplay streaming, and so on?


Is there a secondary processor in the Xbox One to help handle the OS?
 

KidBeta

Junior Member
According to the available figures, that is correct. [...] The PS4 can selectively flush individual cache lines while the XB1 apparently needs to flush the entire GPU cache. That is the evidence so far.

It should be noted that to flush the entire cache to eSRAM takes 4096 cycles.
 
Last post? Not even close. Sorry - still many more memes yet to come.

As was kindly suggested by someone in a PM, it's unlikely anything more I'm going to post on this topic will make it better. People can believe me and trust I'm passing on the right information, or believe I'm spreading FUD and lies.

When I first started coming on, I said what I wanted to do was speak more directly and more honestly with the community, clarifying what we could, because you guys have more detailed questions than we had been dealing with.

Regarding the power, I've tried to explain areas that are misunderstood and provide insight from the actual engineers on the system. We are working with the technical folks to get more in-depth. As I said - they are more credible than I am, and can provide a lot more detail. Best I leave it to them.

Next stop - launch itself. Only then, when the games release and developers are inevitably asked to compare the systems, will there be a satisfying answer.

Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.
 

Barzul

Member
If there is not a clear, obvious difference on day one, I'll be skeptical of the power differential being as big as purported. It should not take 2-3 years to show a 50% power differential; I'm not buying that, especially as the consoles are running on the same architecture. If I buy a GPU from the same company that is 50% better than what I currently have, the difference would be noticeable once I crank up the settings. Three things I've been led to consider as fact:

-PS4 has better GPU
-PS4 has faster RAM
-PS4 is easier to develop for


If those three things are true, then the PS4 should have a clear performance advantage on day one. These consoles are essentially PCs in their architecture, right? Why would it take that long to see an advantage? Developers have had devkits for months now; if they can make PC versions of games blow away console versions, then, since both consoles and PCs are built on similar architectures, why can't the same happen on both consoles?
 

Andeeeh

Member
At Microsoft, we have a position called a "Technical Fellow." These are engineers across disciplines at Microsoft who are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.

I am sure there are brilliant software and hardware engineers at MS, but you're not doing these great people any favors.
 
Are you kidding? I mean really - are you kidding?

This is part and parcel of the territory here. You have to answer for your statements, especially if you're here in an official capacity. People get banned for being out of line, but poking holes in the arguments of other posters is well within the rules.

I've had my work here both praised and eviscerated, called out by numerous forum folks both publicly and via PM when I got stuff wrong, and I'm a goddamn admin. Guess what - I wouldn't have it any other way. That is what makes NeoGAF what it is.

There are many, many people who are more than capable of assessing, vetting and debunking technical claims and they have every right to do so. That's the price of doing business here. If we had official Nintendo or Sony reps on board, they would be subject to the same process.

If you're scared, buy a dog.

I don't have a problem with that; being called out is fine, but the passive-aggressive behaviour from some is, I think, a bit overboard. There are ways to do it with a calmer approach. I have no problem whatsoever with people taking issue with his claims. It's this attitude which has led to many closed threads over the past few days.
 

SPDIF

Member
Sounds pretty respectable to me.

Yeah, and regardless of the name, there's no doubt that if you're able to get that title, you're a genius as far as computer science is concerned.

This isn't the Technical Fellow that Albert was talking about, but one of the ones working on the Xbox One is Dave Cutler.
 

Pain

Banned
I think Albert severely underestimated how smart NeoGaf is. We are not dumb. We don't need to be Microsoft engineers to figure a lot of this stuff out.
 

R3TRODYCE

Member
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.
Good move.
 

Bowler

Member
I agree. They should really just give up on the graphics argument at this point, it's a losing battle. [...] Then back it up with things like Oculus Rift support and Forza glasses...

That's a tough pill to swallow for your corporation when you are more expensive than the competition.
 

K' Dash

Member
Are you kidding? I mean really - are you kidding? [...] If you're scared, buy a dog.



My take on this, if you can't compete in tech (not saying XB1 can't, I really don't know much about it), focus on what really matters: TEH GAEMZ
 

KidBeta

Junior Member
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.

512KB at 128 bytes/clock = 4096 clocks. Pretty simple maths, man.
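The 4096-cycle figure follows directly from the numbers in the post; a minimal sketch, assuming KidBeta's (unofficial) 512KB cache size and 128 bytes written per clock:

```python
# Cache-flush cycle count per the figures in the post above.
cache_bytes = 512 * 1024  # 512 KB GPU cache (figure from the post)
bytes_per_clock = 128     # write width to ESRAM (figure from the post)

cycles = cache_bytes // bytes_per_clock
print(cycles)  # 4096
```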
 

benny_a

extra source of jiggaflops
I said I wouldn't post more but the situation seems defused now.

I'll preface this post by saying that, having never owned a Sony console and thus not become invested in any of their franchises, I'm an Xbox fan. I've preordered an Xbox One and am very happy with that decision. That said, I'm fully willing to admit that the PS4 is a more technically capable machine on paper. I'm just not sure that the difference will manifest itself at launch.

When Penello says that the difference is not as great as you might think, he's referring to right now. Developers are still learning the intricacies of both systems. It makes sense that they'll be pretty comparable during the early stages. Heck, I'm pretty sure Sony itself has said it will likely be a couple years before the true potential of the PS4 is capitalized upon. As for right now, I'm content waiting for the actual launches before making assumptions about wide gaps in performance.
The reason people like myself argue that there will be a difference at launch is that there have never been consoles as similar as these before.

One has straight-up more power than the other. The Xbox One is at an architectural disadvantage due to its memory setup. If we assume the SDKs are at the same maturity level, then it's easier to get better performance out of a PS4 than an Xbox One.

That the PS4 arguably has more optimization headroom beyond the launch games than the Xbox One doesn't even enter the equation for launch.
 
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.

Honest question - how about an AMA with your tech guy? It could be very informative.
 
With all due respect, I don't believe Albert. Compare the 7850 (1.76TF) with the 7770 (1.28TF): the performance difference is staggering. Now, both the PS4 and Xbox One have better GPUs than the aforementioned (1.84 vs 1.31). However, the gap between the two console GPUs is greater than the gap between the two AMD GPUs I have just mentioned.

Here is badb0y's writeup:
http://www.neogaf.com/forum/showpost.php?p=74541511&postcount=621

I do not see how Microsoft will mitigate that gap unless they're using some sorcery.
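The ratios in that comparison do check out on paper. A quick sketch, using the TFLOPS figures as quoted in the post:

```python
# Relative TFLOPS gaps from the post above.
gap_gpus = 1.76 / 1.28      # Radeon 7850 vs 7770
gap_consoles = 1.84 / 1.31  # PS4 vs Xbox One (paper figures)

print(f"7850/7770: {gap_gpus:.3f}x, PS4/XB1: {gap_consoles:.3f}x")
# The console gap (~1.40x) is indeed slightly wider than the 7850/7770 gap (1.375x).
```

Whether a paper-TFLOPS ratio predicts real game performance is, of course, the point being argued in the replies below.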

GPUs are really complex designs; too complex to pin performance differences solely on compute units.

Brute processing power is often the cheapest resource you have: the one you actually sacrifice to make up for a lack of a more expensive one (usually memory). To a certain degree that holds true even for PS3 and 360 games. Just look at what gets downgraded on them compared to PC ports: things that depend solely on processing power are actually the closest (at least visually) to their PC counterparts.

Yeah, I know this time around the architectures are a lot closer than last gen, but increasing processing power is not the only way to increase performance. In fact, if you look at high-performance heterogeneous designs from Nvidia and Intel, they are moving away from brute-forcing performance gains through more processing power and are instead increasing the amount of on-chip memory, so overall performance rises while costs (be it transistor count or power consumption) stay controlled.

Also, the Xbone is developed by the same team that made the 360. I trust them to know what they are doing. And if you happen to watch any of the Gamefest presentations, it's clear that they really understand what their hardware does, what actually affects performance, and how you can improve it. The 360 devkit had so many debugging flags and performance counters that, even fairly recently, Nvidia's dev tools didn't offer the same insight into where your performance is going.

And how can a processing power gap be closed, you ask? It's not magic. More processing power means the hardware can process more numbers at the same time; if those numbers are not there to be processed, more processing power won't do a thing to increase your performance. That means tasks that are not FLOP-bound are not going to be automatically faster, and it also means that if some task, like gathering the numbers to be processed, is much slower on one processor than on the other, the processing-power advantage might not translate into actual performance either. Once you think of the system as a whole, you realize there's a lot more to getting the desired performance than just raw processing power. And you also realize there are lots of different ways in which performance can be improved.
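The "more FLOPS only helps if the numbers are there to be processed" argument is essentially the roofline model. A sketch with illustrative numbers (the peak figures below are placeholders, not actual console specs):

```python
def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
    """Roofline model: delivered performance is capped by compute or
    memory bandwidth, whichever runs out first."""
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

# Hypothetical machines: B has ~40% more raw FLOPS but identical bandwidth.
peak_a, peak_b, bandwidth = 1310, 1840, 68

for intensity in (2, 10, 40):  # FLOPs performed per byte fetched
    a = attainable_gflops(peak_a, bandwidth, intensity)
    b = attainable_gflops(peak_b, bandwidth, intensity)
    print(f"intensity {intensity:>2}: A={a:.0f} GFLOPS, B={b:.0f} GFLOPS")

# At low arithmetic intensity both machines are bandwidth-bound and identical;
# the FLOPS advantage only shows once the workload becomes compute-bound.
```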
 
I don't have a problem with that, being called out is fine but the passive aggressive behaviour from some I think is a bit overboard. [...]

People who know about tech feel completely intellectually insulted by his posts. That much is clear. Your defense of him seems a bit overboard and defensive.
 

Bitmap Frogs

Mr. Community
We concede that the PS4 does have slightly better specs. But Xbox One offers entertainment in ways that go beyond the graphical capabilities of a machine. Like Kinect (explain), like the TV connectivity (explain), like our first party titles (explain)


I do think not having a showcase of what Kinect 2 can do is hurting them. I mean, "Xbox, TV" and "Xbox, go home" are cool and all that, but not a Kinect-seller, at least for the types that frequent internet gaming forums.

Moving Ryse from a Kinect game to a gamepad game might have been a terrible mistake. Not only is Ryse becoming the butt of all jokes, it also deprived them of a Kinect flagship.
 
D

Deleted member 80556

Unconfirmed Member
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.

Then you didn't do anything, and you've left us in the same place as we were before.

The fact that you're not commenting on this is worrisome.

Try us. I mean it. People will understand, we're trying to understand, and you're not helping.

EDIT: Ask your superiors if you can have that Fellow come here to answer our questions, because people here do know their stuff (even if some like to think the contrary). If it checks out, more respect to you, and we learn something new. We all win.
 
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.

Regardless of the shit being slung at you, thanks for coming here and being a face for your company. MS sorely needs more people like you.
 

Gaz_RB

Member
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.

That's all you can do I guess.

I respect that you haven't run away with your tail between your legs, props
 

Josman

Member
Albert, I understand you're not an expert on the X1's hardware, so trying to argue against a whole forum with experts on the subject is suicidal. If you're going to say there is no point arguing because you say so, the least you could do is accept an AMA: just take a list of questions, give them to your tech guy at MS, and come back with the answers. Otherwise your reputation in here is pretty much damaged beyond repair; this is NeoGAF, after all.
 

hawk2025

Member
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.



Many, many people have now pointed out that, on your original post, the first bullet point **directly** contradicts the second.

I'm sorry, but if you won't even address something like that, which requires absolutely no technical knowledge other than basic algebra... No amount of arguments from authority will fix that, I'm afraid.
 

nib95

Banned
Last post? Not even close. Sorry - still many more memes yet to come. [...] Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.

Could you further elaborate on this post? Happy to be corrected but there seems to be a lack of information.

OK, just a quick run-through of points...


  • What is this inherent inefficiency you speak of? Can you elaborate? It is not something I've ever heard mentioned.
  • Your second point contradicts your first. If 50% more CU performance is viable to inefficiencies, why would 6% extra performance not also be privy to the same thing?
  • How did you arrive at the 204gb/s figure for the Esram, can you elaborate? Also you realise this is a very disinginuous claim. YES the bandwidth can be added together in that the DDR3 and Esram can function simultaneously, but this tells only a small part of the full story. The Esram still only accounts for a meagre 32mb of space. The DDR3 ram, which is the bulk of the memory (8GB) is still limited to only 68gb/s, whilst the PS4's GDDR5 ram has an entire 8GB with 176gb/s bandwidth. This is a misleading way to present the argument of bandwidth differences.
  • How do you know you have 10% more CPU speed? You said you are unaware of the PS4's final specs, and rumours of a similar upclock have been floating around. It could also be argued that the XO has the more capable audio chip because the system's Kinect audio features are more demanding, something the PS4 does not have to cater to. Add to that, the PS4 also has a (less capable) audio chip, along with a secondary custom chip (supposedly used for background processing). There's that to consider too.
  • That's good that Microsoft understands GPGPU, but that does not take away from the inherent GPGPU customisations afforded to the PS4. The PS4 also has 6 additional compute units, which is a pretty hefty advantage in this field.
  • This is factually wrong. With Onion plus Onion+, the PS4 also has 30GB/s of coherent bandwidth.
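To put the bandwidth point in concrete terms, here's a rough back-of-the-envelope sketch using the peak figures quoted above (theoretical peaks, not sustained throughput):

```python
# Peak-bandwidth comparison using the figures from the posts above.
# These are theoretical peaks, not sustained real-world throughput.

XB1_DDR3_GBPS = 68     # 8 GB DDR3 main memory
XB1_ESRAM_GBPS = 204   # 32 MB ESRAM (combined read+write peak)
PS4_GDDR5_GBPS = 176   # 8 GB GDDR5 unified memory

# Simply adding DDR3 + ESRAM peaks gives the headline number...
xb1_combined = XB1_DDR3_GBPS + XB1_ESRAM_GBPS

# ...but the ESRAM figure only applies to a 32 MB window;
# the remaining ~8 GB is limited to the DDR3 figure.
esram_fraction = 32 / (8 * 1024)

print(f"XB1 naive combined peak: {xb1_combined} GB/s")
print(f"XB1 bulk (DDR3) peak:    {XB1_DDR3_GBPS} GB/s")
print(f"PS4 unified peak:        {PS4_GDDR5_GBPS} GB/s")
print(f"ESRAM covers only {esram_fraction:.2%} of 8 GB")
```

The point being made above is visible in the last line: the 204GB/s number applies to well under 1% of the address space, so quoting the sum as if it described the whole memory system is misleading.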
 

jayu26

Member
Albert, I understand you're not an expert on the X1's hardware, so trying to argue against a whole forum with experts on the subject is suicidal. If you're gonna say that there is no point arguing because you say so, the least you could do is accept an AMA: just take a list of questions, give them to your tech guy at MS, and come back with the answers. Otherwise your reputation in here is pretty much damaged beyond repair; this is NeoGAF, after all.

I second this.
 

B.O.O.M

Member
Yikes, I thought that last thread would have been a good indication that most of GAF doesn't take BS lightly.

A few users being overly defensive about people calling Albert out won't change that either. It's fair game.

imho this is my insignificant advice to MS/Penello etc.: embrace the weaker hardware angle, or at least don't talk about it. Highlight the additional value you bring with Kinect, software etc. This is just harming your image, and distrust in MS PR is only going to get worse.
 

SPDIF

Member
Albert, I understand you're not an expert on the X1's hardware, so trying to argue against a whole forum with experts on the subject is suicidal. If you're gonna say that there is no point arguing because you say so, the least you could do is accept an AMA: just take a list of questions, give them to your tech guy at MS, and come back with the answers. Otherwise your reputation in here is pretty much damaged beyond repair; this is NeoGAF, after all.

Let's not get crazy now.
 

Pain

Banned
Last post? Not even close. Sorry - still many more memes yet to come.

As was kindly suggested by someone in a PM, it's unlikely anything more I'm going to post on this topic will make it better. People can believe me and trust I'm passing on the right information, or believe I'm spreading FUD and lies.

When I first started coming on, I said what I wanted to do was speak more directly and more honestly with the community, clarifying what we could because you guys have more detailed questions than we had been dealing with.

Regarding the power, I've tried to explain areas that are misunderstood and provide insight from the actual engineers on the system. We are working with the technical folks to get more in-depth. As I said - they are more credible than I am, and can provide a lot more detail. Best I leave it to them.

Next stop - launch itself. Only then, when the games release and developers are inevitably asked to compare the systems, will there be a satisfying answer.

Until then, as I have been, I'll try and answer what I can. But I'm not going to add more on this topic.
So you won't even try to defend your previous posts or elaborate on them? You're talking technical here; you can't just make factually incorrect statements and then refuse to explain yourself when someone proves you wrong.

If you don't have the technical knowledge to talk about the Xbox One yourself, you shouldn't be talking about it like you do.
 

Bsigg12

Member
Then you didn't do anything and have left us in the same place as we were before.

The fact that you're not commenting on this is worrisome.

Try us. I mean it. People will understand, we're trying to understand, and you're not helping.

How is it worrisome? He isn't as technically inclined, which he mentions. He posted what the engineers told him. Until they do a full breakdown, which I can't see coming until closer to launch because of the FCC confidentiality letter that is in effect until November 21st, we probably won't know how it all works.
 

Yoday

Member
I agree. They should really just give up on the graphics argument at this point; it's a losing battle. Instead they need to try and convince me and others why, despite having lesser graphics, theirs is the box for me and not the graphically better station.

I think I would try something like this:

We concede that the PS4 does have slightly better specs. But Xbox One offers entertainment in ways that go beyond the graphical capabilities of a machine. Like the Kinect (explain), like the TV connectivity (explain), like our first party titles (explain).

We at Microsoft believe that graphics can only take you so far in video game entertainment and that there are better ways to improve user experiences.


Then back it up with things like Oculus Rift Support and Forza glasses...
When the mass gaming press are leading most people to believe that the two systems are identical in power, they have absolutely no reason to concede to the PS4 being more powerful. They aren't going to risk that kind of admission going viral and further lowering interest in their console just to appease GAF.
 

MORT1S

Member
512KB at 128 bytes / clock = 4096 clocks. Pretty simple maths man.
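Spelling out that arithmetic (and, as an assumption for scale, using the Xbox One's announced 853MHz GPU clock to convert clocks into wall time):

```python
# Clocks needed to move a 512 KB GPU cache at 128 bytes per clock,
# as in the quoted calculation.
CACHE_BYTES = 512 * 1024   # 512 KB
BYTES_PER_CLOCK = 128

clocks = CACHE_BYTES // BYTES_PER_CLOCK
print(clocks)  # 4096

# At the announced 853 MHz GPU clock, that flush takes roughly:
GPU_HZ = 853_000_000
microseconds = clocks / GPU_HZ * 1_000_000
print(f"{microseconds:.1f} us")  # ~4.8 us per full flush
```

Small in absolute terms, but per the earlier discussion the cost matters because the X1 apparently flushes the whole cache while the PS4 can flush individual lines.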

Sorry, I think you've misunderstood what Albert meant by "Last post? Not even close".

I believe he was referring to a post earlier that said that Albert's last post would be his last post in the thread.

Just something I noticed while trying to catch up on this topic.
 

Caronte

Member
Albert, I understand you're not an expert on the X1's hardware, so trying to argue against a whole forum with experts on the subject is suicidal. If you're gonna say that there is no point arguing because you say so, the least you could do is accept an AMA: just take a list of questions, give them to your tech guy at MS, and come back with the answers. Otherwise your reputation in here is pretty much damaged beyond repair; this is NeoGAF, after all.

I agree with this.
 

Klocker

Member
Yeah, and regardless of the name, there's no doubt that if you're able to get that title, you're a genius as far as computer science is concerned.

This isn't necessarily the technical fellow that Albert was talking about, but one of the ones working on the Xbox One is Dave Cutler.

Whoa.. wicked smaht!


Dave Cutler
Born
March 13, 1942 (age 71)
Lansing, Michigan, USA

Occupation
Senior Technical Fellow at Microsoft


VMS

In April 1975, Digital began a hardware project, code-named Star, to design a 32-bit virtual address extension to its PDP-11. In June 1975, Cutler, together with Dick Hustvedt and Peter Lippman, were appointed the technical project leaders for the software project, code-named Starlet, to develop a totally new operating system for the Star family of processors...... snip

...Xbox

As of January 2012, a spokesperson for Microsoft has confirmed that Cutler is no longer working on Windows Azure, and has since joined the Xbox team.[6] No further information was provided as to what Cutler's role was, nor what he was working on within the team.

In May 2013, Microsoft announced the Xbox One console, and Cutler was mentioned as having worked on the development of the host OS portion of the system running inside the new gaming device. Apparently Cutler's work focused on creating an optimized version of Microsoft's Hyper-V host OS specifically designed for the Xbox One.
 

KidBeta

Junior Member
Sorry, I think you've misunderstood what Albert meant by "Last post? Not even close".

I believe he was referring to a post earlier that said that Albert's last post would be his last post in the thread.

Just something I noticed while trying to catch up on this topic.

Thanks I feel silly now :[
 