That's true however that doesn't suddenly get you 7970 performance from the PS4 gpu for example.You're acting like Sony just grabbed a 7850 off the shelf and used it for the PS4. It's custom designed and has advantages.
Thanks!
No, people were saying it was a 7850. The difference between the 50 and the 70 is pretty decent.
They have already said several times it's the "most powerful console ever made".
And we have the specs.
Just look at the spec sheets and you can see that the PS4 is clearly more powerful.
I mean, look at this (pic on the site, which the OP mentioned):
Xbone: 1.31 TF GPU (12 CUs) for games
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPS
Xbone: 2 ACE / 16 queues
PS4: 1.84 TF GPU (18 CUs) for games
PS4: 1152 Shaders +50%
PS4: 72 Texture units +50%
PS4: 32 ROPS +100%
PS4: 8 ACE / 64 queues +300%
We are surely not talking about a 14% difference here.
And the PS4's better memory/RAM solution is another big advantage.
But yes... we have already known all that for months now.
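For what it's worth, the TF figures in that list follow directly from CU count and clock speed. A quick sketch in Python (the 853 MHz XB1 and 800 MHz PS4 clocks are the publicly reported figures, not numbers from this thread):

```python
# Theoretical single-precision throughput for a GCN GPU:
# shaders * 2 ops/cycle (fused multiply-add) * clock.
def gcn_tflops(cus, clock_ghz, shaders_per_cu=64):
    """Theoretical TFLOPS from CU count and clock speed."""
    return cus * shaders_per_cu * 2 * clock_ghz / 1000.0

xb1 = gcn_tflops(12, 0.853)  # ~1.31 TF
ps4 = gcn_tflops(18, 0.800)  # ~1.84 TF
print(round(xb1, 2), round(ps4, 2))
print(f"PS4 advantage: {(ps4 / xb1 - 1) * 100:.0f}%")
```

That works out to roughly a 40% raw-GPU gap, well above the 14% some posts argue for.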
I guess you are reading it wrong... 50 isn't the number you use to measure the theoretical performance of the XB1; 35 is the max theoretical for it, and 45 is the corresponding max for the PS4. 23 + 27 = 50 is listed as the absolute theoretical limit of the XB1, as mentioned in the article.
The 32-35 is the author somehow [seemingly magically] deriving what the actual achievable performance maximum is.
I was incorrect to use "theoretical" as I did in my previous statement.
That all being said, I pretty much see this as BS until we have some comparisons against other benchmarking tests showing similar real-world differences between PC cards.
Yup, it has more shader units than a 7850 (1152 vs 1024) but a slower clock (800 vs 860-ish MHz), so it edges a stock 7850 slightly.
I remember people saying it was between a 7850 and a 7870, closer to the 70; according to the chart that doesn't seem to be the case, it looks closer to the 7850.
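The "edges a stock 7850 slightly" claim checks out with the same shaders-times-clock arithmetic (a rough sketch only; real performance also depends on ROPs, bandwidth, and drivers):

```python
# shaders * 2 ops/cycle * clock (GHz) -> GFLOPS
ps4_gpu = 1152 * 2 * 0.800  # 1843.2 GFLOPS
hd7850  = 1024 * 2 * 0.860  # 1761.3 GFLOPS at a typical 860 MHz clock
print(f"PS4 edge over a stock 7850: {(ps4_gpu / hd7850 - 1) * 100:.1f}%")
```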
Will the inclusion of eSRAM cause problems for backwards compatibility in the future compared to what I'm assuming is a more standard set up in the PS4?
Not necessarily, I think, since the peak bandwidth of the eSRAM is easily achievable on GDDR5.
I love this comparison stack.
Every time a new bit of PR comes out from MS trying to diminish the gap in power, you just have to look at this to see the difference is freaking huge for two consoles priced the same.
The Xbone really should be $75 cheaper or something.
Am I wrong in assuming the PS4 has a GPU that is close to the performance of a GTX 660 because of this?
I mean, everyone knows the PS4 is more powerful; what's to settle? And I see we already have the pixel Titanfall gif. Maybe someone will quote MisterX soon.
But peak bandwidth doesn't compensate for latency. The eSRAM is way "nearer" to the compute units than main RAM will be.
I'm sorry, but that's just not the case. In essentially every way, the PS4's hardware is more powerful than the XBO's.
I knew someone was going to do that.
I'm glad it was you.
Maybe we should go through these same things on every performance thread... Should we ban all threads regarding this issue?
People thought the 10% GPU boost would bring the XB1 close to the PS4 and deliver parity, not realising that we have known the specs of both systems since last year and that talk of the performance gap already took the full specs of the consoles into account. The 10% MS had reserved just made things worse.
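To put rough numbers on that (hedged: this assumes the reported ~10% OS reservation came straight off GPU time, which is a simplification):

```python
xb1_tf, ps4_tf = 1.31, 1.84       # full-hardware theoretical figures
xb1_with_reserve = xb1_tf * 0.9   # ~10% of GPU time held back for the OS

gap_before = (ps4_tf / xb1_with_reserve - 1) * 100  # gap with reservation
gap_after  = (ps4_tf / xb1_tf - 1) * 100            # gap after reclaiming it
print(f"{gap_before:.0f}% -> {gap_after:.0f}%")     # narrows, but hardly disappears
```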
Show me a PC 7850 that can duplicate that @ 60fps
all 32 MB of it (lol)
This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.
Consoles have always been able to do much more with much weaker on-paper hardware.
And that was with 10MB, so not quite as funny as you would think. eSRAM requires more careful planning to make the best use of it, but let's not act like Microsoft hasn't already proven quite clearly that they know exactly what they're doing when it comes to this kind of stuff.
It is not surprising that the console with more powerful hardware can produce better graphics. It really depends on the developer caring enough to utilize the potential. I am worried that 3rd-party games will keep failing to use the PS4 to its fullest because the XBO holds it back. Oh well, 900p and 30 FPS ports, here we come...
Wait a minute! The benchmark has been updated!
Vertex buffer objects are models, and textures are textures. That quote is consistent with the eSRAM being used for just the g-buffers, in which case be prepared for 1408x792, as that's the single most efficient use of it above 720p.
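The 32 MB constraint that post is gesturing at can be sanity-checked. A sketch, assuming a hypothetical 20-bytes-per-pixel g-buffer (four 32-bit colour targets plus 32-bit depth/stencil); actual layouts vary per engine:

```python
ESRAM_BYTES = 32 * 1024 * 1024  # the Xbox One's 32 MB of eSRAM

def gbuffer_fits(width, height, bytes_per_pixel=20):
    """Does a full-resolution g-buffer fit entirely in eSRAM?"""
    return width * height * bytes_per_pixel <= ESRAM_BYTES

print(gbuffer_fits(1408, 792))   # True  - fits with room to spare
print(gbuffer_fits(1920, 1080))  # False - 1080p overflows into DDR3
```

With these (assumed) numbers, a full 1080p g-buffer simply doesn't fit, which is the kind of pressure behind the odd sub-1080p resolutions being discussed.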
I'm more intrigued by the 8 ACEs on the PS4. That's very high, even more than a 7970. Very curious what it will mean for graphics in the future.
Says you. Plenty of great looking games being released at higher resolutions suggests otherwise. Maybe, just maybe, it will vary from game to game and from developer to developer. There is no simple resolution equation that anyone here or anywhere else can come up with that will apply as an absolute across the entire spectrum of game releases. Many things have been said, and yet there continues to be Xbox One games that simply don't follow those rules.
It should give developers looking to use GPU compute a far greater level of control over how the hardware handles their compute workload, which will surely have its benefits. This has always been the beauty of consoles, and why I love them so much. With more experience developers will begin taking advantage of all kinds of interesting hardware specific optimizations or tricks that squeeze all kinds of interesting results from the system. Naughty Dog in particular should do some pretty incredible things with it. As incredible as Uncharted 4 will probably be, it's Uncharted 5 or whatever major project they have planned afterwards that will most likely demonstrate their mastery of the hardware. I was among those that weren't all that impressed with the first Uncharted on the PS3, but Uncharted 2 blew me away.
I know you are under NDA, Matt, but there was talk about the eSRAM being much more capable than initially thought. Couldn't that bandwidth help the One bridge the gap, at least in terms of memory capabilities, if only a little..?
Well, yes, the ESRAM does help bridge the gap, so to speak, in that without it the XBO basically wouldn't work. But it does not and simply cannot make the memory system of the XBO "close" in ability to the PS4's (close of course being relative in this context).
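For context, the publicly reported peak-bandwidth figures (approximate, and theoretical peaks at that, not numbers from this thread) frame the question:

```python
# Publicly reported peak bandwidth, GB/s (approximate figures)
xb1_ddr3  = 68.3   # DDR3-2133 on a 256-bit bus, covers all 8 GB
xb1_esram = 204.0  # theoretical combined read+write, but only for 32 MB
ps4_gddr5 = 176.0  # GDDR5 at 5.5 Gbps effective on a 256-bit bus, all 8 GB

# The eSRAM figure applies only to data living in its 32 MB;
# everything else on the XB1 moves at DDR3 speed.
print(f"PS4 main-memory advantage: {ps4_gddr5 / xb1_ddr3:.1f}x")
```

So the eSRAM can match or beat GDDR5 on peak numbers, but only for whatever fits in 32 MB, which is why it narrows the gap rather than closing it.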
Aren't their CPUs equally shitty?
I mean, they are both not great. And they aren't far apart, but I wouldn't exactly say equal.
Who is this lady?
Every time this is misquoted and is no longer relevant, a console warrior gets his wings.
Long have PS4 fans claimed the PS4 to be so much more powerful, citing numbers like 50 % more powerful. But these numbers and some simple math show that the difference is much smaller than that.
The PS4 bar is at 42 points and the first XBO bar at 28 points.
42 - 28 = 14
This would make the PS4 14% more powerful, which by itself is almost negligible. Add to this the fact that the article states that optimal memory management would give better results than the top bar, plus the performance improvements of DirectX 12 and beyond and the power of the cloud, and we should arrive at parity or maybe even end up with the XBO slightly more powerful.
So worst case scenario: PS4 is 14 % more powerful. Best case scenario: XBO is slightly more powerful. So I don't see how this article causes so much celebration among Sony Gaf.
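A note on the arithmetic above: 14 is the absolute gap in chart points, not a percentage. Expressed relative to the XBO bar, the same numbers give a rather different figure:

```python
ps4_points, xbo_points = 42, 28  # bar heights read off the article's chart

point_gap = ps4_points - xbo_points            # absolute gap in points
pct_gap = (ps4_points / xbo_points - 1) * 100  # relative difference
print(point_gap, f"{pct_gap:.0f}%")  # 14 points is a 50% advantage
```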