
Virtual testing of PS4 and XBO GPUs proves PS4 has bigger grafix numbers

Status
Not open for further replies.

foxbeldin

Member
[image]
 

Gurish

Member
No, people were saying it was a 7850. The difference between the 50 and the 70 is pretty decent.

I remembered people saying it was between the 7850 and 7870, closer to the 70, but according to the chart it seems closer to the 7850.
 

driver116

Member
They have already said several times it's the "most powerful console ever made".
And we have the specs.
Just look at the spec sheets and you can see that the PS4 is clearly more powerful.

I mean, look at this (pic on the site, which the OP mentioned):


Xbone: 1.31 TF GPU (12 CUs) for games
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPS
Xbone: 2 ACE/ 16 queues

PS4: 1.84TF GPU ( 18 CUs) for games
PS4: 1152 Shaders +50%
PS4: 72 Texture units +50%
PS4: 32 ROPS + 100%
PS4: 8 ACE/64 queues +300%

We are surely not talking about a 14% difference here ;)
And the PS4's better memory/RAM solution is another big advantage.

But yes.... we already know all that (for months) now.

Interesting, I wonder if Sony will bump the GPU clock speed in the future.
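Those TFLOPS figures in the spec list follow directly from CU count and clock, assuming the standard GCN layout of 64 shaders per CU and 2 FLOPs (one fused multiply-add) per shader per cycle. A quick sanity check:

```python
# Sanity-check the quoted TFLOPS from CU count and clock speed.
# Assumes GCN: 64 shaders per CU, 2 FLOPs (one FMA) per shader per cycle.

def gcn_tflops(cus, clock_mhz):
    shaders = cus * 64
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(gcn_tflops(12, 853))  # Xbox One (post-bump 853 MHz): ~1.31 TF
print(gcn_tflops(18, 800))  # PS4 (800 MHz): ~1.84 TF
```

Both match the numbers in the comparison above, so the "+50%" on shaders/TFLOPS checks out arithmetically.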
 

ethomaz

Banned
23 + 27 = 50 is listed as the actual absolute theoretical limit of the XB1 as mentioned in the article



The 32 - 35 is the author somehow [seemingly magically] drawing what the actual performance maximum possible is.

I was incorrect to use "theoretical" as I did in my previous statement.

That all being said, I pretty much see this as BS until we have some comparisons to other benchmarking tests showing similar real-world differences between PC cards.
I guess you are reading it wrong... 50 is what you can't use to measure the theoretical performance of the XB1... 35 is the max theoretical for it, and 45 is the max theoretical for the PS4.

Neither console will reach these theoretical numbers in real-life cases... maybe the PS4 gets close to 40 and the XB1 close to 30 in best-case scenarios.
 

Renekton

Member
I remembered people saying it was between the 7850 and 7870, closer to the 70, but according to the chart it seems closer to the 7850.
Yup, it has more shader units than the 7850 (1152 vs 1024) but a slower clock (800 vs 860ish), so it edges out the stock 7850 slightly.
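Multiplying shaders by clock (again assuming GCN's 2 FLOPs per shader per cycle) makes that comparison concrete:

```python
# Raw shader throughput in GFLOPS: shaders * 2 FLOPs per cycle * clock (MHz).
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

ps4 = gflops(1152, 800)     # ~1843 GFLOPS
hd7850 = gflops(1024, 860)  # ~1761 GFLOPS
print(ps4, hd7850, ps4 / hd7850)  # PS4 is roughly 5% ahead
```

So the extra shaders slightly outweigh the slower clock, which is exactly the "edges out the stock 7850" conclusion.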
 

theDeeDubs

Member
Will the inclusion of eSRAM cause problems for backwards compatibility in the future, compared to what I'm assuming is a more standard setup in the PS4?
 

twobear

sputum-flecked apoplexy
Will the inclusion of eSRAM cause problems for backwards compatibility in the future, compared to what I'm assuming is a more standard setup in the PS4?

Not necessarily, I think, since the peak bandwidth of the eSRAM is easily achievable on GDDR5.
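For context, peak bandwidth is just bus width times transfer rate; the widely quoted figures are ~109 GB/s one-way (~204 GB/s combined read+write) for the XB1's eSRAM versus 176 GB/s for the PS4's GDDR5, so the claim above is plausible on paper:

```python
# Back-of-envelope peak bandwidth: (bus width in bits / 8 bytes) * transfer rate.
def bandwidth_gb_s(bus_bits, mtransfers_per_s):
    return bus_bits / 8 * mtransfers_per_s / 1000.0

# PS4 main memory: 256-bit GDDR5 bus at 5500 MT/s
print(bandwidth_gb_s(256, 5500))  # 176.0 GB/s
```

That's a theoretical ceiling, of course; sustained real-world bandwidth is lower on both machines.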
 

Havel

Member
Yup, it has more shader units than 7850 (1152 vs 1024) but slower clock (800 vs 860ish), so it edges stock 7850 slightly.

The problem is that these are just virtual benchmarks and don't take into account certain advantages that can only be proven in real life performance.

I don't see how the article can state they are confident that there is basically zero deviation from these virtual tests and real performance.
 
They have already said several times it's the "most powerful console ever made".
And we have the specs.
Just look at the spec sheets and you can see that the PS4 is clearly more powerful.

I mean, look at this (pic on the site, which the OP mentioned):


Xbone: 1.31 TF GPU (12 CUs) for games
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPS
Xbone: 2 ACE/ 16 queues

PS4: 1.84TF GPU ( 18 CUs) for games
PS4: 1152 Shaders +50%
PS4: 72 Texture units +50%
PS4: 32 ROPS + 100%
PS4: 8 ACE/64 queues +300%

We are surely not talking about a 14% difference here ;)
And the PS4's better memory/RAM solution is another big advantage.

But yes.... we already know all that (for months) now.

I love this comparison stack.

Every time a new bit of PR comes out from MS trying to diminish the gap in power, you just have to look at this to see the difference is freaking huge for two consoles priced the same.

The Xbone really should be $75 cheaper or something.
 

jaypah

Member
I love this comparison stack.

Every time a new bit of PR comes out from MS trying to diminish the gap in power, you just have to look at this to see the difference is freaking huge for two consoles priced the same.

The Xbone really should be $75 cheaper or something.

They should be the same price with Kinect included. Seems fair.
 
I could do a post explaining the technical side of this stuff if you guys want and dispel some fundamental misconceptions in this thread.

It'd be a giant wall of text though - no way to get around that.
 
I'm having deja vu and I'm a junior...

What is the point of this again and again... it's common knowledge now that the PS4 is the more "powerful" system...

Isn't this just ego stroking now?
 

Piggus

Member
People thought the 10% GPU boost would make the XB1 close to the PS4 and bring parity, not realising we had known the specs of both systems since last year and that talk of the performance gap already took the full specs of the consoles into account. The 10% MS reserved just made things worse.

The people who thought/still think that don't actually know anything about the hardware and slurp up MS's PR like hot ramen.
 
all 32 MB of it (lol)

This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.

And that was with 10MB, so not quite as funny as you would think. eSRAM requires more careful planning to make the best use of it, but let's not act like Microsoft hasn't already proven quite clearly that they know exactly what they're doing when it comes to this kind of stuff.

[gif]


Show me a PC 7850 that can duplicate that @ 60fps

consoles have always been able to do much more with much weaker on paper hardware.

Come on, that isn't even gameplay. It's a cutscene. And with DirectX 12 coming to the PC, and things like Mantle already available, a 7850 is likely quite capable of the above. The PS4 is a pretty powerful console, we all know this, but the worship of its hardware prowess sometimes goes in some interesting directions. :)
 

Genio88

Member
[gif]


Show me a PC 7850 that can duplicate that @ 60fps

consoles have always been able to do much more with much weaker on paper hardware.

If Uncharted 4's gameplay really looks like that, and I mean the real gameplay, not the cutscenes (which were already great in Uncharted 1-3 and The Last of Us), then I'm gonna sell my PC and buy two PS4s.
 
It is not surprising that the console with more powerful hardware can produce better graphics. It really depends on the developer giving a crap about utilizing the potential. I am worried that 3rd-party games will keep underusing the PS4 due to the XBO holding it back. Oh well, 900p and 30 FPS ports here we come...
 
And that was with 10MB, so not quite as funny as you would think. eSRAM requires more careful planning to make the best use of it, but let's not act like Microsoft hasn't already proven quite clearly that they know exactly what they're doing when it comes to this kind of stuff.

Vertex buffer objects are models and textures are textures. That quote is consistent with the eSRAM being used just for g-buffers, in which case be prepared for 1408x792, as that's the single most efficient use of it above 720p.
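To illustrate why 32 MB is tight for a g-buffer (using a hypothetical layout of four render targets plus depth/stencil at 4 bytes per pixel each; actual layouts vary by engine):

```python
# Estimate g-buffer footprint against the 32 MB eSRAM budget.
# Hypothetical layout: 4 render targets + depth/stencil, 4 bytes per pixel each.
BYTES_PER_PIXEL = 5 * 4

def gbuffer_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 * 1024)

print(gbuffer_mb(1408, 792))   # ~21.3 MB -> fits in 32 MB
print(gbuffer_mb(1920, 1080))  # ~39.6 MB -> does not fit
```

With a layout like this, a full 1080p g-buffer simply can't live entirely in eSRAM, which is why developers either drop resolution or split buffers between eSRAM and DDR3.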
 
Approximately 33% to 50% better depending on the scenario.

So, you're telling me all those months of denial on MS's part were for naught? We had the whole Albert Pennello ordeal with his technical fellows...yet at the end of the day, reality still winds up aligning with our initial expectations.

[image: You don't say?]
 
It is not surprising that the console with more powerful hardware can produce better graphics. It really depends on the developer giving a crap about utilizing the potential. I am worried that 3rd-party games will keep underusing the PS4 due to the XBO holding it back. Oh well, 900p and 30 FPS ports here we come...

You should be more worried about cross-gen ports holding back your PS4 than the XO holding back your PS4.
 

omonimo

Banned
I'm more intrigued by the 8 ACEs on the PS4. That's very high, even more than a 7970. Very curious what it will mean for graphics in the future.
 
Vertex buffer objects are models and textures are textures. That quote is consistent with the eSRAM being used just for g-buffers, in which case be prepared for 1408x792, as that's the single most efficient use of it above 720p.

Says you. Plenty of great looking games being released at higher resolutions suggests otherwise. Maybe, just maybe, it will vary from game to game and from developer to developer. There is no simple resolution equation that anyone here or anywhere else can come up with that will apply as an absolute across the entire spectrum of game releases. Many things have been said, and yet there continues to be Xbox One games that simply don't follow those rules.

I'm more intrigued by the 8 ACEs on the PS4. That's very high, even more than a 7970. Very curious what it will mean for graphics in the future.

It should give developers looking to use GPU compute a far greater level of control over how the hardware handles their compute workload, which will surely have its benefits. This has always been the beauty of consoles, and why I love them so much. With more experience developers will begin taking advantage of all kinds of interesting hardware specific optimizations or tricks that squeeze all kinds of interesting results from the system. Naughty Dog in particular should do some pretty incredible things with it. As incredible as Uncharted 4 will probably be, it's Uncharted 5 or whatever major project they have planned afterwards that will most likely demonstrate their mastery of the hardware. I was among those that weren't all that impressed with the first Uncharted on the PS3, but Uncharted 2 blew me away.
 

mrklaw

MrArseFace
Says you. Plenty of great looking games being released at higher resolutions suggests otherwise. Maybe, just maybe, it will vary from game to game and from developer to developer. There is no simple resolution equation that anyone here or anywhere else can come up with that will apply as an absolute across the entire spectrum of game releases. Many things have been said, and yet there continues to be Xbox One games that simply don't follow those rules.



It should give developers looking to use GPU compute a far greater level of control over how the hardware handles their compute workload, which will surely have its benefits. This has always been the beauty of consoles, and why I love them so much. With more experience developers will begin taking advantage of all kinds of interesting hardware specific optimizations or tricks that squeeze all kinds of interesting results from the system. Naughty Dog in particular should do some pretty incredible things with it. As incredible as Uncharted 4 will probably be, it's Uncharted 5 or whatever major project they have planned afterwards that will most likely demonstrate their mastery of the hardware. I was among those that weren't all that impressed with the first Uncharted on the PS3, but Uncharted 2 blew me away.


I'm really looking forward to what console devs do with GPGPU - most likely on PS4 as it has more capacity to free some up for GPGPU, more ACEs, and several dev teams already very familiar with that style of programming after the PS3's SPEs.

And that work will potentially feed back into PC games too, so it is good for everyone.
 
Says you. Plenty of great looking games being released at higher resolutions suggests otherwise. Maybe, just maybe, it will vary from game to game and from developer to developer. There is no simple resolution equation that anyone here or anywhere else can come up with that will apply as an absolute across the entire spectrum of game releases. Many things have been said, and yet there continues to be Xbox One games that simply don't follow those rules.

NO

There is no real-world workload where the XBO can outperform the PS4. Developers cannot perform miracles.
 

Matt

Member
I know you are under NDA, Matt, but there was talk about the eSRAM being much more capable than initially thought. Could not that bandwidth help the One bridge the gap, at least in terms of memory capabilities, if only a little..?
Well, yes, the ESRAM does help bridge the gap, so to speak, in that without it the XBO basically wouldn't work. But it does not and simply cannot make the memory system of the XBO "close" in ability to the PS4 (close of course being relative in this context).
 
Who is this lady?

Chloe Dykstra. She's Chris Hardwick's ex-girlfriend (he said on Twitter last week that they "decided to part ways"). Her father, John, is pretty famous: he pioneered motion-control camera technology, which made many of the effects (particularly the shots of ships flying in space) in the original Star Wars trilogy possible.
 
Long have PS4 fans claimed the PS4 to be so much more powerful, citing numbers like 50 % more powerful. But these numbers and some simple math show that the difference is much smaller than that.

The PS4 bar is at 42 points and the first XBO bar at 28 points.

42 - 28 = 14

This would make the PS4 14 % more powerful, which by itself is almost negligible. Add to this the fact that the article states that optimal memory management would give better results than the top bar, and the performance improvements of DirectX 12 and beyond and the power of the clowd, and we should arrive at parity or maybe even make XBO slightly more powerful.

So worst case scenario: PS4 is 14 % more powerful. Best case scenario: XBO is slightly more powerful. So I don't see how this article causes so much celebration among Sony Gaf.

Lol

Says you. Plenty of great looking games being released at higher resolutions suggests otherwise. Maybe, just maybe, it will vary from game to game and from developer to developer. There is no simple resolution equation that anyone here or anywhere else can come up with that will apply as an absolute across the entire spectrum of game releases. Many things have been said, and yet there continues to be Xbox One games that simply don't follow those rules.

Always entertaining
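For what it's worth, subtracting the bars in that post gives a point gap, not a percentage; the relative difference is a ratio:

```python
# Percentage difference between two benchmark scores is a ratio, not a subtraction.
def percent_faster(a, b):
    return (a - b) / b * 100

print(percent_faster(42, 28))  # 50.0 -> 42 points is 50% higher than 28, not 14%
```

Which is why "42 - 28 = 14%" gets laughed at: the chart's own numbers land right on the ~50% figure people have been citing all along.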
 

belmonkey

Member
It should give developers looking to use GPU compute a far greater level of control over how the hardware handles their compute workload, which will surely have its benefits. This has always been the beauty of consoles, and why I love them so much. With more experience developers will begin taking advantage of all kinds of interesting hardware specific optimizations or tricks that squeeze all kinds of interesting results from the system. Naughty Dog in particular should do some pretty incredible things with it. As incredible as Uncharted 4 will probably be, it's Uncharted 5 or whatever major project they have planned afterwards that will most likely demonstrate their mastery of the hardware. I was among those that weren't all that impressed with the first Uncharted on the PS3, but Uncharted 2 blew me away.

Wouldn't GPGPU basically remove CPU bottlenecks for games like Watch Dogs or BF4 MP, where more CPU performance might be needed?
 