Ah, GAF, where everyone is a programmer and a computer engineer...
Digital Foundry: The GPU compute comparison seems to be about Xbox One's high coherent read bandwidth vs. raw ALU on PS4. But don't the additional ACEs added to PS4 aim to address that issue?
Andrew Goossen: The number of asynchronous compute queues provided by the ACEs doesn't affect the amount of bandwidth or number of effective FLOPs or any other performance metrics of the GPU. Rather, it dictates the number of simultaneous hardware "contexts" that the GPU's hardware scheduler can operate on at any one time. You can think of these as analogous to CPU software threads - they are logical threads of execution that share the GPU hardware. Having more of them doesn't necessarily improve the actual throughput of the system - indeed, just like a program running on the CPU, too many concurrent threads can make aggregate effective performance worse due to thrashing. We believe that the 16 queues afforded by our two ACEs are quite sufficient.
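To make Goossen's CPU-thread analogy concrete, here's a minimal sketch (my own toy benchmark, nothing from the interview; the workload sizes and thread counts are arbitrary). Spawn far more software threads than the machine has hardware threads and throughput stops improving, then starts to slip as scheduling overhead piles up:

```cpp
// Toy illustration of thread oversubscription: a fixed amount of CPU work is
// split across more and more software threads. Throughput plateaus at the
// hardware thread count; far beyond it, context-switch overhead adds cost.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

static void run_with_threads(unsigned n_threads, unsigned long long total_iters) {
    auto start = std::chrono::steady_clock::now();
    std::atomic<unsigned long long> sink{0};  // keeps the work from being optimised away
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n_threads; ++t) {
        pool.emplace_back([&] {
            unsigned long long acc = 0;
            for (unsigned long long i = 0; i < total_iters / n_threads; ++i)
                acc += i * i;
            sink += acc;
        });
    }
    for (auto& th : pool) th.join();
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - start).count();
    std::printf("%5u threads: %lld ms\n", n_threads, static_cast<long long>(ms));
}

int main() {
    unsigned hw = std::thread::hardware_concurrency();
    if (hw == 0) hw = 4;  // the call may legitimately return 0 if it can't tell
    std::printf("hardware threads: %u\n", hw);
    // Same total work each run; only the thread count changes.
    for (unsigned n : {1u, hw, hw * 16u, hw * 256u})
        run_with_threads(n, 1ull << 28);
}
```

Past hardware_concurrency(), the extra threads mostly buy you context switches rather than throughput, which is the thrashing he's pointing at; the ACE queue count is the GPU-side analogue of the logical thread count here.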
Except, I AM a programmer, and a computer engineer. You know what I program? Intensive real-time 3D applications using 3D engines. Sometimes for consoles.
Prove what, exactly? That 1.84 TFLOPS > 1.18 TFLOPS? Have you ever heard of "math" and how it works? The PS4 is objectively more powerful; I don't need shills like RL to tell me otherwise.
Honestly, the way he presents Microsoft's strengths and misrepresents the PS4's (to the point of implying they are in fact weaknesses) has been consistent since the console reveals some months ago.
But the most obvious example was when he tried to downplay the difference in "compute" performance using wildly unrepresentative PC GPU comparisons, in a tapestry of misrepresentation, goalpost-moving, and either utter incompetence or an unshakable agenda.
People have said he is single-handedly dragging down Digital Foundry's reputation, and I have to agree.
So... do you think that when he comes to do his face-offs in the coming years, he's going to fake videos and screenshots and basically lie through his teeth in order to keep MS happy?
...because that's what all the haters of RL are implying with their pathetic character assassination of a guy who's been in the industry for nearly 30 years and knows more about this stuff than 99% of the inane fanboys who come out with this shit.
I think the Digital Foundry bias toward Microsoft is evident by now; they've basically become a PR outlet for them. Microsoft pushes their spin, Eurogamer gets the clicks, and both parties are happy.
This isn't about which console is more powerful (that's clearly the PS4); it's about the likes of you making baseless accusations that he's deliberately pushing an agenda of Microsoft's liking, whether formally or informally.
Guess you missed the articles where he clearly states the PS4 is the more powerful machine when it comes to GPU power.
No, that's just him reporting what the Xbox engineers are telling him; he's made no judgment as to whether it will accurately reflect what we see on our screens over the coming years.
He IS Digital Foundry; he's the one who founded it.
The GPU compute comparison seems to be about Xbox One's high coherent read bandwidth vs. raw ALU on PS4. But don't the additional ACEs added to PS4 aim to address that issue?
Because when the 360 wins, it's because it's better; when the PS3 wins, it's because the developers are incompetent.

I've never got this. Why are they biased? Because 360 games almost always came out on top in the console face-offs?
Salt?
I rather wish they'd just posted the full interview from the beginning.
It's an interesting read, but not a huge amount of new interesting stuff vs the articles published in advance of this.
However, the big surprise? They were upfront and decided not to play the eSRAM latency card at all, in either a rendering or GPGPU context. I'm surprised; I thought they would try to fudge around that by amplifying some corner-case benefits. It also, again, clarifies the eSRAM's only real purpose: bandwidth.
The clarification that the CPU only has 'very slow' access to eSRAM is also interesting given Leadbetter's earlier comments about Microsoft viewing 'low latency access to shared memory' as key to good GPGPU performance. I guess Richard misinterpreted something.
The GPGPU commentary was interesting because they seemed to acknowledge the importance of more ALU into the future, but the only answer they seemed to have that differs from the tweaks Sony is also making is the amount of coherent bandwidth. I'm not sure how XB1 actually compares here; the numbers we've seen from Sony are split out a little differently. But my underlying thought while reading it was that while GPGPU might be able to saturate 30GB/s of DDR3 bandwidth in pure GPGPU benchmarks, I really wonder if there'd be that kind of bandwidth going spare in a mixed compute/graphics scenario in a real game on XB1. It would be much easier for PS4 to fully utilise whatever coherent and incoherent bandwidth it has available during GPGPU tasks.

The dithering around the ACE question was unconvincing too; the performance cost of context switching is something AMD has flagged as an issue to address in future architectures.
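To put rough numbers on that worry, here's a toy budget calculation (entirely my own back-of-envelope, not from the article; the 68GB/s DDR3 and 30GB/s coherent figures are the commonly quoted peaks, and the graphics-consumption numbers are pure guesses):

```cpp
// Toy bandwidth-budget arithmetic for the mixed compute/graphics question.
// 68 GB/s DDR3 and 30 GB/s coherent read are the publicly quoted XB1 peaks;
// how much DDR3 a renderer already eats is a pure guess, varied for effect.
#include <algorithm>
#include <cstdio>

int main() {
    const double ddr3_total_gbs    = 68.0;  // quoted peak DDR3 bandwidth
    const double coherent_peak_gbs = 30.0;  // quoted coherent read bandwidth

    // Hypothetical DDR3 consumption by graphics/CPU traffic at various loads.
    for (double gfx_use_gbs : {20.0, 40.0, 55.0}) {
        double spare = ddr3_total_gbs - gfx_use_gbs;
        double usable_coherent = std::min(spare, coherent_peak_gbs);
        std::printf("graphics using %4.0f GB/s -> %4.0f GB/s spare, "
                    "so at most %4.0f GB/s of the coherent path is usable\n",
                    gfx_use_gbs, spare, usable_coherent);
    }
}
```

In other words, the quoted 30GB/s coherent peak is only reachable when the rest of the system is leaving that much DDR3 bandwidth on the table, which is exactly the doubt raised above.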
Are you honestly telling me you haven't noticed how RL downplays the PS4's power advantages? Here, just read this:
I'll just leave it here http://www.digitalfoundry.org/showcase2/high_definition.html
Isn't that just a case of how you build rapport in an interview?
If you want an example of bad rapport, look at Angry Joe interviewing Major Nelson. As soon as AJ started asking awkward questions and disagreeing with the rep of the company he was interviewing, MN just shut down.
If I'm trying to get something from someone I don't make them feel awkward.
MS (well, these guys in the interview) are not saying Xbox is more powerful than PS4. They carefully avoid specific comparisons. They are purely talking about the Xbox in isolation.
I realise this always ends up in a 'but which is better' discussion, but this article doesn't really set out to answer that - it is there to help us better understand how Xbox is set up. For me, it does that nicely, and raises some questions I'd like to know more about - like the comments about having your render targets split between eSRAM and DDR3.
It is probably a more detailed look than we've had for PS4 so far (although that architecture does seem more straightforward so perhaps a deeper look isn't as necessary)
Because when the 360 wins, it's because it's better; when the PS3 wins, it's because the developers are incompetent.
Yeah, but he could at least hint that it's not accurate.
MS (well, these guys in the interview) are not saying Xbox is more powerful than PS4. They carefully avoid specific comparisons. They are purely talking about the Xbox in isolation.
I realise this always ends up in a 'but which is better' discussion, but this article doesn't really set out to answer that - it is there to help us better understand how Xbox is set up. For me, it does that nicely, and raises some questions I'd like to know more about - like the comments about having your render targets split between eSRAM and DDR3.
It is probably a more detailed look than we've had for PS4 so far (although that architecture does seem more straightforward so perhaps a deeper look isn't as necessary)
Andrew Goossen said: The number of asynchronous compute queues provided by the ACEs doesn't affect the amount of bandwidth or number of effective FLOPs or any other performance metrics of the GPU. Rather, it dictates the number of simultaneous hardware "contexts" that the GPU's hardware scheduler can operate on at any one time. You can think of these as analogous to CPU software threads - they are logical threads of execution that share the GPU hardware. Having more of them doesn't necessarily improve the actual throughput of the system - indeed, just like a program running on the CPU, too many concurrent threads can make aggregate effective performance worse due to thrashing. We believe that the 16 queues afforded by our two ACEs are quite sufficient.
Wait, isn't that technically trying to put it in PS4's favour?
Now this is
How about this part? I call this downplaying the competition:
MS (well, these guys in the interview) are not saying Xbox is more powerful than PS4. They carefully avoid specific comparisons. They are purely talking about the Xbox in isolation.
It is probably a more detailed look than we've had for PS4 so far (although that architecture does seem more straightforward so perhaps a deeper look isn't as necessary)
Not really. The dude should be on the left, as that's where the majority of the weight is, unless the rocks are different densities and the Xbox side is actually heavier than the PS side.
Brown-nose them in the interview, then slag them off in the article?
Maybe burning bridges isn't the best thing to do.
That's circular logic. Essentially, the best version is just the lead SKU almost every time. Why the decision to make the PS3 the lead SKU is so rare is a valid question, but the answer is so clear-cut that there's not much discussion to be had about it.

Surely that is the case on many occasions? If 9/10 versions see the 360 version performing better, and then 1/10 games sees the 360 version falling on its face, wouldn't you ask the question: why can't these developers do what the other 9 did?
I'll just leave it here http://www.digitalfoundry.org/showcase2/high_definition.html
"The biggest source of your frame-rate drops actually comes from the CPU, not the GPU"
Difficult to avoid the comparison when the question being asked itself implies the comparison.

How about this part? I call this downplaying the competition:
If they said nothing, people would have noticed. They did say something and people noticed. It's a catch-22.

DF: The GPU compute comparison seems to be about Xbox One's high coherent read bandwidth vs. raw ALU on PS4. But don't the additional ACEs added to PS4 aim to address that issue?
Pertaining to the PS4: while it will still remain ahead of the Xbone in terms of raw power, how do we know whether a certain % of GPU power (as with the CPU) is locked away for non-gaming purposes?
I've seen you post this a couple of times now. I'm genuinely curious what your point/goal is?

This is from the last big thread:
X1 GPU:
1.18 TF GPU (12 CUs) for games
768 Shaders
48 Texture units
16 ROPs
2 ACEs / 16 queues

PS4 GPU:
1.84 TF GPU (18 CUs) for games (+56%)
1152 Shaders (+50%)
72 Texture units (+50%)
32 ROPs (+100%)
8 ACEs / 64 queues (+300%)
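For what it's worth, the arithmetic behind those TF figures is just shader count x 2 ops/clock (fused multiply-add) x clock speed. A quick sketch, assuming the commonly quoted clocks of 853MHz for XB1 (post-upclock) and 800MHz for PS4; the list's "for games" numbers and the +56% appear to bake an OS reservation into the XB1 side, since the raw peak gap works out to about +41%:

```cpp
// Peak FLOPS = shaders x 2 ops/cycle (FMA) x clock. Shader counts are from
// the list above; clocks are the commonly quoted 853 MHz (XB1, post-upclock)
// and 800 MHz (PS4). Any OS reservation would come off these raw peaks.
#include <cstdio>

static double peak_tflops(int shaders, double clock_ghz) {
    // shaders x 2 x GHz yields GFLOPS; divide by 1000 for TFLOPS.
    return shaders * 2.0 * clock_ghz / 1000.0;
}

int main() {
    double xb1 = peak_tflops(768,  0.853);  // 12 CUs x 64 shaders
    double ps4 = peak_tflops(1152, 0.800);  // 18 CUs x 64 shaders
    std::printf("XB1 peak: %.2f TF, PS4 peak: %.2f TF, PS4 advantage: +%.0f%%\n",
                xb1, ps4, (ps4 / xb1 - 1.0) * 100.0);
}
```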
I'll just leave it here http://www.digitalfoundry.org/showcase2/high_definition.html
Yes, that is super interesting. Even though, as a programmer, I'm sure I'd much prefer the more "elegant" and straightforward PS4 architecture, the XB1 team made more unusual technological choices, which makes for more interesting stories to read about (that's not the programmer in me speaking, that's the engineer). It was the same with other "complicated" hardware architectures (Amiga, SNES, Saturn, PS2, PS3...): always hard to get to grips with, but often brilliant in their approach too.

While some of the article is certainly about how it compares to PS4, mostly it's just about the design itself. You can admire the thought process and technology of a device even if there is a superior product offered by the competition.
I don't think he is a shill. Microsoft is just providing content to him as part of their marketing push, and Sony just isn't, probably because Sony's PR guys are satisfied with the public perception of the console.

Is Leadbetter a shill? I don't know, but I also don't know if it matters. It's not like he lies in his comparisons; he's sometimes wrong, but he tends to correct those things when they happen. When PS3 games are better, it's typically about how the developers are incompetent, which I think is obnoxious, but ultimately, who cares? If the PS4 versions of games are better, it's not like he's not going to report it.
I've seen you post this a couple times now. I'm genuinely curious what your point/goal is?
Sigh... why are there so many idiots like this?