
John Carmack on PS4 vs. Xbox One Specs: They're 'Very Close'

stryke

Member
 
it should be clear as night/day, but we didn't see it, and something doesn't add up to me

coz i didn't feel the differences AT ALL
You cannot use what we've seen so far as any kind of determinant. Most (all?) of the games shown at E3 and elsewhere have been running on powerful PC hardware, nothing like what's in the final consoles.

Edit in response to your edit: stop using the DF article as any kind of source, their comparison is garbage, as has been discussed at great length here and elsewhere.
 

Krilekk

Banned
That could be straight from some PR flyer.

Which is probably where it is from; too many guerrilla marketing morons these days. There, I said it.

I thought it was known that PS4 has 4.5 GB GDDR5 available to devs and X1 5 GB DDR3? There's that additional 512 MB on PS4 but that's for some caching/virtual disk thing. Might wanna update those PR flyers unless you want to be sued for pretending otherwise.
 

twobear

sputum-flecked apoplexy
In many ways these two are closer than a lot of previous generations. Neither of them have exotic hardware or the like. They're both based on the same CPU and GPU.

When all is said and done I expect the difference to be comparable to the gap between top-end PS3 games and top-end 360 games. There's a noticeable difference between something like Halo 4 and The Last of Us, or between Forza 4 and GT6. It's obvious, and PS3 games look notably better, but not like a generational gap. When people say that this generation 'was a wash' because the 360 had a more powerful GPU (not strictly true) and the PS3 a more powerful CPU, they ignore the fact that the PS3's CPU was much, much more powerful than the 360's whereas the GPU kinda really was a wash.

The main difference of course is that PS4 will have better multiplatform games. If nothing else the exact same assets will run better on PS4. Moreover, because the power is much more accessible, the difference will become apparent more quickly (I don't know why people think it will become more apparent at the end of the generation).
 

Pistolero

Member
@ The Kayle :


Sigh... you don't make any sense. First, the power gap, which exists, is not a huge one. We're not talking about a two or three times more powerful machine, so the differences should be minimal (AA, a more stable framerate, perhaps better post-processing...). Second, we don't have a way of comparing when there are many variables that create noise (how long was the development of the titles on each platform? What about the budgets? What sacrifices were made to get the engine running?...). Plus, what are you comparing exactly: Ryse vs Knack?
 

gofreak

GAF's Bob Woodward
ok, let's put it this way

there are two cars

1 - a 1200 cc
2 - a 1800 cc turbo

both cars need to be presented to the customers by doing some laps

You're going to have to wait until November to even begin to see them doing the same laps.

I do not believe anyone has shown their game running on both pieces of hardware, or side by side. At E3 everyone was showing on one or the other - although, tellingly, mostly on PS4 or PC supposedly.

So we'll start to see how they compare 'on the same laps' when the systems and games are released. And we may have to wait a while longer to see them compete on next-gen exclusive multiplats vs cross-over games, where performance characteristics might be different again. In the meantime, though, for people trying to make a decision now around which to get or which to preorder, around which is more likely to offer the best performance over the next couple of years of cross-generation multiplats, and the next several years of next-gen exclusive multiplats, I'm not sure how anyone could argue PS4 isn't the safe bet.
 
Which is probably where it is from; too many guerrilla marketing morons these days. There, I said it.

I thought it was known that PS4 has 4.5 GB GDDR5 available to devs and X1 5 GB DDR3? There's that additional 512 MB on PS4 but that's for some caching/virtual disk thing. Might wanna update those PR flyers unless you want to be sued for pretending otherwise.

Provide a source which definitively confirmed it is 4.5 GB; then we can say it is 'known'. Otherwise, quite a few sources have said 4.5 GB + 1 GB flexible, which can be used for either the OS or games. Either way, there has been no reliable confirmation. Of course, it's not surprising that you seem to go for all the lowest estimates, eh?
 

TheKayle

Banned
You cannot use what we've seen so far as any kind of determinant. Most (all?) of the games shown at E3 and elsewhere have been running on powerful PC hardware, nothing like what's in the final consoles.

Edit in response to your edit: stop using the DF article as any kind of source, their comparison is garbage, as has been discussed at great length here and elsewhere.

i judge a GAMING console by the GAMES they show me... i don't masturbate my mind over Cerny's or Major Nelson's words... if i need to go on specs, i really don't like to talk about APU machines, coz it's like trying to win an F1 race with a Volvo

and again, like last gen, there were 3rd party games that looked better than exclusives (MGS5, The Division)... exclusives running on what should be the less powerful console that looked better than games running on the one more powerful on paper... and there will be some games that show the power differences, etc etc

that's it... another PS3/X360 gen
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I have an ATI card and it doesn't look like that -- currently. If you do a reverse image search on that shot you'll see that it was taken a day after launch, which was before AMD got their shit together and released the correct drivers for the game.

It's kinda sad that even to this day id catches so much flak for something that was entirely AMD's fault...

Yep. I remember reading how Carmack was hacked off because AMD had promised to release the up-to-date driver he'd been using during development.
 

TheKayle

Banned
@ The Kayle :


Sigh... you don't make any sense. First, the power gap, which exists, is not a huge one. We're not talking about a two or three times more powerful machine, so the differences should be minimal (AA, a more stable framerate, perhaps better post-processing...). Second, we don't have a way of comparing when there are many variables that create noise (how long was the development of the titles on each platform? What about the budgets? What sacrifices were made to get the engine running?...). Plus, what are you comparing exactly: Ryse vs Knack?

oh, this is the point... when ppl fill their mouths with '50%', it's like they're expecting something crazy that will never happen
i'm with you on this

i was comparing the whole lot... Ryse, KZ, Forza, Driveclub, Quantum Break, Knack, etc etc
 

TheKayle

Banned
You're going to have to wait until November to see them doing the same laps.

I do not believe anyone has shown their game running on both pieces of hardware, or side by side. At E3 everyone was showing on one or the other - although, tellingly, mostly on PS4 or PC supposedly.

So we'll see when the systems and games are released how they compare 'on the same laps'. In the meantime, though, for people trying to make a decision now around which to get or which to preorder, I'm not sure how anyone could argue PS4 isn't the safe bet.

clearly PS4 is the safer bet

but will MS go all in on games (giving ppl lots of exclusives), knowing they've got the weaker console? could this be a "game changer" too, or not?
 

Pistolero

Member
i judge a GAMING console by the GAMES they show me... i don't masturbate my mind over Cerny's or Major Nelson's words... if i need to go on specs, i really don't like to talk about APU machines, coz it's like trying to win an F1 race with a Volvo

and again, like last gen, there were 3rd party games that looked better than exclusives (MGS5, The Division)... exclusives running on what should be the less powerful console that looked better than games running on the one more powerful on paper... and there will be some games that show the power differences, etc etc

that's it... another PS3/X360 gen

MGS2 and ZOE2, on the PS2, were more impressive than many Xbox games, and the Xbox was a lot more powerful. They were made by talented developers and targeted the strengths of the machine (bandwidth, mainly...).
As for games looking better on the Xbox One, I'd like to know which ones you're referring to...
 

beast786

Member
ok, let's put it this way

there are two cars

1 - a 1200 cc
2 - a 1800 cc turbo

both cars need to be presented to the customers by doing some laps

before all of this starts,
we already know that the first (1200) is not only slower, but its mechanics are also late in optimizing the aerodynamics, engine, etc etc... everyone knows this

the second car (1800 turbo) is ready, its engine EEPROM is well developed and optimized, everything works great, and already on paper we know this car has at least a 50% advantage over the first one


the day of the presentation arrives and they do their laps
when they run, the 1800 turbo doesn't destroy the 1200; instead they arrive pretty much equal, or ppl need a photo finish, coz the 1200 probably did better than the 1800 turbo

this is what happened with the presentation of the games running on both consoles
on paper the PS4 should destroy the XB1

if one of the two consoles has better-developed drivers (PS4), better dev tools (PS4), a 50% more powerful GPU (PS4), a unified pool of faster RAM (PS4), more RAM allocated for games (PS4), and a less heavy OS (PS4)

do you need years of learning curve to show the power differences?

and for example, even if both were running unoptimized games, it would be EASY for the more powerful console, with this big a gap, to show the power differences... something even the worst developer should manage to show...

it should be clear as night/day, but we didn't see it, and something doesn't add up to me

coz i didn't feel the differences AT ALL, and i'm sure that for the average joe, graphics like those of Ryse or Quantum Break could be the best of what we've seen

is this underwhelming for devs who work on a console that is so much better than the other? or will the gap be just what DF showed us... a number of frames you can count on two hands' fingers

You forgot the best part: the 1800 cc turbo is also 25% cheaper.
 

TheKayle

Banned
MGS2 and ZOE2, on the PS2, were more impressive than many Xbox games, and the Xbox was a lot more powerful. They were made by talented developers and targeted the strengths of the machine (bandwidth, mainly...).
As for games looking better on the Xbox One, I'd like to know which ones you're referring to...

i think, graphically speaking, Ryse looked impressive... but TO ME, if the words of the Remedy creative director are true and it was real-time in engine, Quantum Break was the best of the show (of course we need to see the gameplay)
 

Pistolero

Member
i think, graphically speaking, Ryse looked impressive... but TO ME, if the words of the Remedy creative director are true and it was real-time in engine, Quantum Break was the best of the show

That was a cutscene, not a gameplay segment. By that logic we should also be able to include The Order 1886, which was easily just as impressive, and the Dark Sorcerer demo, an even more impressive bit, proven to be real-time. As for the games playable on devkits (Ryse, Forza, Killzone, KI, Infamous SS), Infamous SS was the best of the bunch, imo.
 

badb0y

Member
yes, some fps, something around 25% in certain situations (i think DF talked about this)

this was before the upclock... now something around 18%

Holy shit, those idiots over at Digital Foundry have really fucked up this conversation, haven't they?

To preface this post: although I admire what Digital Foundry tried to do, their methodologies were not only flawed but straight-up wrong.

The first thing they did wrong was selecting the wrong graphics cards to compare the two configurations.

For the Xbox One they used a HD 7850 over a HD 7790, while for the PS4 they used a 7870 XT over a 7850. This is wrong because using the 7850 for the Xbox One thrusts it into a whole different category of graphics cards that brings many advantages not present in the Xbox One GPU (32 ROPs vs 16 ROPs).

The methodology they used would have been perfect if they had grabbed a HD 7790 and clocked it to ~731 MHz for the Xbox One's GPU, and used a HD 7850 clocked to 898 MHz for the PS4's.

Xbox One = 1.310 TFLOPS ------- (HD 7790) Bonaire @ 731 MHz = 0.731 × 2 × 896 = ~1.310 TFLOPS
PS4 = 1.84 TFLOPS ------- (HD 7850) Pitcairn @ 898 MHz = 0.898 × 2 × 1024 = ~1.84 TFLOPS

I reckon using these cards at these clocks would really show the GPU difference between the two consoles, but alas, Digital Foundry went about it some other dumbass way.

To get to the point: I don't have the hardware mentioned above on hand to conduct the tests myself, but I can extrapolate data from graphics cards that are similar to what goes into these consoles. For the Xbox One that would be an HD 7770, because it has a similar amount of GPU power to the GPU used in the Xbox One (1.310 TFLOPS vs 1.28 TFLOPS), while for the PS4 I would use the HD 7850 (1.84 TFLOPS vs 1.76 TFLOPS), and then compare the results for a realistic look at the performance gap. Oh, by the way, the gap between 1.28 (HD 7770) and 1.76 (HD 7850) is smaller than the gap between 1.31 (Xbox One) and 1.84 (PS4), so that's already giving a slight break to the Xbox One in this comparison.

For my analysis I am using benchmark numbers from Anandtech.com:
http://anandtech.com/bench/product/777?vs=778
Using the data above I came up with the following chart:
[benchmark chart]

I submit to you that the performance difference between the Xbox One and the PS4 will be much more apparent than what Digital Foundry has hypothesized, and that the overall difference in power will not only be tangible but possibly even staggering.
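For what it's worth, every TFLOPS figure being thrown around in this thread comes from the same simple formula: shader count × 2 FLOPs per clock (a fused multiply-add) × clock rate. A quick sketch, using the proxy cards proposed above plus what I believe are the published console shader counts and clocks (treat those as assumptions):

```python
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    # Peak single-precision throughput: shaders * 2 FLOPs per clock (FMA) * clock rate.
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# The downclocked/overclocked PC cards proposed in the post above:
xb1_proxy = peak_tflops(896, 731)   # HD 7790 (Bonaire) @ 731 MHz -> ~1.31 TFLOPS
ps4_proxy = peak_tflops(1024, 898)  # HD 7850 (Pitcairn) @ 898 MHz -> ~1.84 TFLOPS

# The consoles themselves (768 vs 1152 shaders, per the widely reported specs):
xb1 = peak_tflops(768, 853)         # ~1.31 TFLOPS (after the 853 MHz upclock)
ps4 = peak_tflops(1152, 800)        # ~1.84 TFLOPS

print(f"proxy gap ~{ps4_proxy / xb1_proxy - 1:.0%}, console gap ~{ps4 / xb1 - 1:.0%}")
```

By this arithmetic the raw gap is roughly 40%; the ~50% figure quoted elsewhere in the thread corresponds to the Xbox One's pre-upclock 800 MHz (~1.23 TFLOPS). Note this is peak theoretical throughput only, not measured game performance.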
 

TheKayle

Banned
That was a cutscene, not a gameplay segment. By that logic we should also be able to include The Order 1886, which was easily just as impressive, and the Dark Sorcerer demo, an even more impressive bit, proven to be real-time. As for the games playable on devkits (Ryse, Forza, Killzone, KI, Infamous SS), Infamous SS was the best of the bunch, imo.

as an open world it was WOW till i saw MGS5 :(
 

gofreak

GAF's Bob Woodward
clearly PS4 is the safer bet

but will MS go all in on games (giving ppl lots of exclusives), knowing they've got the weaker console? could this be a "game changer" too, or not?

I don't know.

I think a look at platform holders' track records in supporting exclusive development across a system's lifecycle might inform a potential buyer here. I don't think there's anything to indicate PS4 is a riskier bet on that front, quite to the contrary perhaps.

If someone has specific games they want that are only available on Xbox One, and are heart set on those franchises or games, obviously that's another scenario. But if someone wants 'the best choice' in general for a console for the next 6 or 7 years, with no specific ties, Xbox is a hard sell right now.
 
i judge a GAMING console by the GAMES they show me... i don't masturbate my mind over Cerny's or Major Nelson's words... if i need to go on specs, i really don't like to talk about APU machines, coz it's like trying to win an F1 race with a Volvo
Obviously things like art direction and creative design will trump technical specs, but if you're going to be comparing titles across platforms...

and again, like last gen, there were 3rd party games that looked better than exclusives (MGS5, The Division)... exclusives running on what should be the less powerful console that looked better than games running on the one more powerful on paper... and there will be some games that show the power differences, etc etc
My point is you actually haven't seen anything of the sort. Everything you've seen was running on jacked up PC hardware. No one knows what those games will look like running on their actual consoles yet.

Unless again you're simply saying those supposedly weaker games appeal more to you visually, or had better direction in your opinion, which is all good. But then you really shouldn't bring up Digital Foundry or percentage differences, too much misinformation there.

that's it... another PS3/X360 gen
I really don't think it's going to be so cut and dried. Ironically, despite far more similar base architectures, the overall directions Microsoft and Sony are pushing in have diverged quite a bit (or rather, Microsoft wants to try new directions). Either way, I think going forward we'll see much more pronounced differences this gen, both in games and otherwise.
 
Does he say anything interesting about the Wii U and Vita (I've seen people saying he talks about them a bit)? (cannot watch vids at the moment)
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
To get to the point since I don't have the hardware I mentioned above on hand to conduct the tests myself but I can extrapolate data from graphics cards that are similar to what goes into these consoles.

You can't compare GPUs on PC, running on the same driver and the same code, to the Xbox One's and PS4's custom hardware.

For starters, the drivers for the Xbox One and PS4 are different and optimised for the hardware. Also, the rendering pipeline and render code base will be different. And finally, devs are developing for a fixed platform.

I've no doubt there might be differences, but your analysis method is severely flawed.
 

TheKayle

Banned
I don't know.

I think a look at platform holders' track records in supporting exclusive development across a system's lifecycle might inform a potential buyer here. I don't think there's anything to indicate PS4 is a riskier bet on that front, quite to the contrary perhaps.

If someone has specific games they want that are only available on Xbox One, and are heart set on those franchises or games, obviously that's another scenario. But if someone wants 'the best choice' in general for a console for the next 6 or 7 years, with no specific ties, Xbox is a hard sell right now.

ok, honestly i think Sony made some mistakes wanting last-minute games for the PS3 (great games like The Last of Us, but especially GT6)... i'm sure that if the big 3 devs doing good work for Sony (PD, ND, SM) had been on PS4, it would look even better than it does now

imagine if The Last of Us was a PS4 game (with next-gen graphics) and they'd shown that at E3... well, MS could just say, "ok guys, if you don't like the XB1, at least buy some Windows 8 this year" :D

in the end i'll buy both consoles, as always... but after months of leaks and hardware talk i was really expecting a bigger graphical gap between these two consoles
so i'm not seeing (right now, eh) Sony winning this gen that easily
 

badb0y

Member
You can't compare GPUs on PC, running on the same driver and the same code, to the Xbox One's and PS4's custom hardware.

For starters, the drivers for the Xbox One and PS4 are different and optimised for the hardware. Also, the rendering pipeline and render code base will be different. And finally, devs are developing for a fixed platform.

I've no doubt there might be differences, but your analysis method is severely flawed.

Not as flawed as Digital Foundry's, and I am using the best data available. We can debate the intricacies of the systems all we want, but the fact of the matter is that the hardware limitations are already in place. No amount of optimization can overcome that, unless you are saying that Sony will have less optimized drivers than Microsoft.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Not as flawed as Digital Foundry's
Debatable.

We can debate the intricacies of the system all we want but the fact of the matter is that the hardware limitations are already in place. No amount of optimization can overcome that unless you are saying that Sony will have less optimized drivers compared to Microsoft.

Xbox One and PS4 are not the same. Devs will have to use different techniques to get the most out of them. I do believe that the eSRAM, if used correctly, could significantly help performance on the Xbox One GPU.

Although on paper the PS4 looks a fair bit more powerful, I think in the real world the performance gap won't be nearly as big as folks think it will be. Certainly not the laughable 50% you've been trying to push.
 
You can't compare GPUs on PC, running on the same driver and the same code, to the Xbox One's and PS4's custom hardware.

For starters, the drivers for the Xbox One and PS4 are different and optimised for the hardware. Also, the rendering pipeline and render code base will be different. And finally, devs are developing for a fixed platform.

I've no doubt there might be differences, but your analysis method is severely flawed.
Holy irony, Batman!

Everything you said applies just as much to the Digital Foundry article badb0y was responding to. The only difference is, he used sane PC hardware to do the same comparison DF did. The point of his post was to show the flaws in their 'analysis method', as you put it.

Debatable
It really, REALLY isn't.
 

badb0y

Member
Debatable.



Xbox One and PS4 are not the same. Devs will have to use different techniques to get the most out of them. I do believe that the eSRAM, if used correctly, could significantly help performance on the Xbox One GPU.

Although on paper the PS4 looks a fair bit more powerful, I think in the real world the performance gap won't be nearly as big as folks think it will be. Certainly not the laughable 50% you've been trying to push.

Are you serious? I am providing raw data, not some pixie dust... lol

My method is not perfect, but it certainly is more accurate than Digital Foundry's. The best method would be the one I mentioned in my original post:
The methodology they used would have been perfect if they had grabbed a HD 7790 and clocked it to ~731 MHz for the Xbox One's GPU, and used a HD 7850 clocked to 898 MHz for the PS4's.

Xbox One = 1.310 TFLOPS ------- (HD 7790) Bonaire @ 731 MHz = 0.731 × 2 × 896 = ~1.310 TFLOPS
PS4 = 1.84 TFLOPS ------- (HD 7850) Pitcairn @ 898 MHz = 0.898 × 2 × 1024 = ~1.84 TFLOPS
 

Pistolero

Member
Sony winning or losing the gen has nothing to do with them having the superior hardware. It's a nice argument to have, and it could sway some buyers, but there are more important issues (price, ease of use, games catalogue, quality of services, network features, advertising...). They're on the right track, though...
As for the continuing support for the PS3, I don't share your view at all. Not only does it send a signal that the company commits to its platforms longer than its competitors do, building a trust relationship with its customers (I, for example, will be buying the PS4 solely on the promise of getting a steady stream of first-party games from some of my favorite developers at Sony for the next 6 years, at least), but it's also the better decision, financially speaking. The PS3 consumer base is much larger, and Sony wants those profits. Plus, the PS4 will sell regardless of the number of first-party exclusives at launch. The tempting price, the good PR, the brand name and the big third-party sellers will be carrying it for quite a few months...
 

quickwhips

Member
Sony winning or losing the gen has nothing to do with them having the superior hardware. It's a nice argument to have, and it could sway some buyers, but there are more important issues (price, ease of use, games catalogue, quality of services, network features, advertising...). They're on the right track, though...
As for the continuing support for the PS3, I don't share your view at all. Not only does it send a signal that the company commits to its platforms longer than its competitors do, building a trust relationship with its customers (I, for example, will be buying the PS4 solely on the promise of getting a steady stream of first-party games from some of my favorite developers at Sony for the next 6 years, at least), but it's also the better decision, financially speaking. The PS3 consumer base is much larger, and Sony wants those profits. Plus, the PS4 will sell regardless of the number of first-party exclusives at launch. The tempting price, the good PR, the brand name and the big third-party sellers will be carrying it for quite a few months...

You are so correct.
 
is this underwhelming for devs who work on a console that is so much better than the other? or will the gap be just what DF showed us... a number of frames you can count on two hands' fingers

So what you are saying is that these systems could've been $299, 800 GFLOPS machines and nobody would have noticed the difference in visuals.
 
For the Xbox One they used a HD 7850 over a HD 7790, while for the PS4 they used a 7870 XT over a 7850. This is wrong because using the 7850 for the Xbox One thrusts it into a whole different category of graphics cards that brings many advantages not present in the Xbox One GPU (32 ROPs vs 16 ROPs).

The methodology they used would have been perfect if they had grabbed a HD 7790 and clocked it to ~731 MHz for the Xbox One's GPU, and used a HD 7850 clocked to 898 MHz for the PS4's.
In their defense, the bandwidth differential between a 7790 and a 7850 would have made it a far worse comparison. The difference between 96 GB/s and ~150 GB/s of bandwidth matters vastly more at 1080p than the difference between 16 and 32 ROPs.

The higher number of ROPs mainly improves MSAA performance at 1080p (rather than baseline performance without), and Eurogamer ran its tests without MSAA. It wasn't a perfect comparison, but it was closer than many people realize.

Edit:

I also just noticed that your chart compares a 7770 to a 7850. The 7770 has just 72 GB/s of bandwidth and is even less representative, because you're giving the Xbox counterpart less than half the bandwidth of the PS4. That's like an X1 without eSRAM.
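For anyone wondering where those bandwidth numbers come from: GDDR5 bandwidth is just the per-pin data rate times the bus width. A quick sketch using what I believe are the stock specs of the retail cards (the data rates and bus widths here are assumptions from the reference designs):

```python
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    # Memory bandwidth: per-pin data rate (Gbit/s) * bus width (bits) / 8 bits per byte.
    return data_rate_gbps * bus_width_bits / 8

hd7770 = bandwidth_gb_s(4.5, 128)  # ~72 GB/s    (4.5 Gbps GDDR5, 128-bit)
hd7790 = bandwidth_gb_s(6.0, 128)  # ~96 GB/s    (6.0 Gbps GDDR5, 128-bit)
hd7850 = bandwidth_gb_s(4.8, 256)  # ~153.6 GB/s (4.8 Gbps GDDR5, 256-bit) -- the "~150" above
```

By that math the 7770 really does have less than half the 7850's bandwidth (72 vs ~154 GB/s), which is the objection being raised here.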
 

gruenel

Member
The methodology they used would have been perfect if they had grabbed a HD 7790 and clocked it to ~731 MHz for the Xbox One's GPU, and used a HD 7850 clocked to 898 MHz for the PS4's.

Xbox One = 1.310 TFLOPS ------- (HD 7790) Bonaire @ 731 MHz = 0.731 × 2 × 896 = ~1.310 TFLOPS
PS4 = 1.84 TFLOPS ------- (HD 7850) Pitcairn @ 898 MHz = 0.898 × 2 × 1024 = ~1.84 TFLOPS

I'd love to see this. Would be a much better comparison for sure.
 

Alebrije

Member
Just suppose the situation were reversed (Xbone with better specs than PS4).

What would Carmack's article, or the media approach in general, be?

I bet we would have lots of articles from some sources with titles like:

"Xbox One hardware 50% better than PS4"

"MS shows a monster at E3, Sony in panic"

"DF: We tested both specs, Xbone 25% better FPS, PS4 pales!!!"

"Sony overclocks PS4 CPU 6%, not enough to achieve success!!!"
 

Principate

Saint Titanfall
Just suppose the situation were reversed (Xbone with better specs than PS4).

What would Carmack's article, or the media approach in general, be?

I bet we would have lots of articles from some sources with titles like:

"Xbox One hardware 50% better than PS4"

"MS shows a monster at E3, Sony in panic"

"DF: We tested both specs, Xbone 25% better FPS, PS4 pales!!!"

"Sony overclocks PS4 CPU 6%, not enough to achieve success!!!"

Eh, it would probably be the same. The persecution complex goes both ways.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Are you serious? I am providing raw data, not some pixie dust... lol

My method is not perfect, but it certainly is more accurate than Digital Foundry's. The best method would be the one I mentioned in my original post:

You are comparing PC parts. When you've done a deep dive on a next-gen dev kit, let me know.
 

MaLDo

Member
Holy shit, those idiots over at Digital Foundry have really fucked up this conversation, haven't they?

To preface this post: although I admire what Digital Foundry tried to do, their methodologies were not only flawed but straight-up wrong.

The first thing they did wrong was selecting the wrong graphics cards to compare the two configurations.

For the Xbox One they used a HD 7850 over a HD 7790, while for the PS4 they used a 7870 XT over a 7850. This is wrong because using the 7850 for the Xbox One thrusts it into a whole different category of graphics cards that brings many advantages not present in the Xbox One GPU (32 ROPs vs 16 ROPs).

The methodology they used would have been perfect if they had grabbed a HD 7790 and clocked it to ~731 MHz for the Xbox One's GPU, and used a HD 7850 clocked to 898 MHz for the PS4's.

Xbox One = 1.310 TFLOPS ------- (HD 7790) Bonaire @ 731 MHz = 0.731 × 2 × 896 = ~1.310 TFLOPS
PS4 = 1.84 TFLOPS ------- (HD 7850) Pitcairn @ 898 MHz = 0.898 × 2 × 1024 = ~1.84 TFLOPS

I reckon using these cards at these clocks would really show the GPU difference between the two consoles, but alas, Digital Foundry went about it some other dumbass way.

To get to the point: I don't have the hardware mentioned above on hand to conduct the tests myself, but I can extrapolate data from graphics cards that are similar to what goes into these consoles. For the Xbox One that would be an HD 7770, because it has a similar amount of GPU power to the GPU used in the Xbox One (1.310 TFLOPS vs 1.28 TFLOPS), while for the PS4 I would use the HD 7850 (1.84 TFLOPS vs 1.76 TFLOPS), and then compare the results for a realistic look at the performance gap. Oh, by the way, the gap between 1.28 (HD 7770) and 1.76 (HD 7850) is smaller than the gap between 1.31 (Xbox One) and 1.84 (PS4), so that's already giving a slight break to the Xbox One in this comparison.

For my analysis I am using benchmark numbers from Anandtech.com:
http://anandtech.com/bench/product/777?vs=778
Using the data above I came up with the following chart:
[benchmark chart]

I submit to you that the performance difference between the Xbox One and the PS4 will be much more apparent than what Digital Foundry has hypothesized, and that the overall difference in power will not only be tangible but possibly even staggering.

I can't imagine that situation on consoles. When devs are coding a game for consoles, they want to stress all components to the max. A good multiplatform game coded for consoles will demand the maximum performance peak from both the CPU and the GPU. It's not the same situation as in PC games, where you can have a very strong CPU + weak GPU, or raise GPU-related graphics options (resolution, AA, shaders...) or CPU options (shadows, LOD, view distance...) too far. You, as the final user, are the performance balancer, tweaking options or buying components.

You have to offer a similar final framerate to the users of both machines, so for a well-balanced console game you can't get better performance just by using a stronger GPU, because your CPU will be stressed to the max. Your only option is to add graphics-only exclusive extras on the machine with the stronger GPU, so that you don't affect the base CPU requirement. And this is a very undesirable situation for a multiplatform developer.

On top of that, you have to consider that many developments started on a DX11 platform as a base, and not everyone is going to devote the time necessary to achieve an optimal port to Sony's libraries. Perhaps the gap in graphics power over the Xbone will be used to save weeks of development on the port: once similar performance is achieved, they will close development, even though more time would have meant a better outcome for PS4.
 
Hey John, let's read the article:
http://www.eurogamer.net/articles/digitalfoundry-rage-face-off

Man, the performance gap between PS3 and 360 is less than 20%, but DF already noted a number of noticeable differences in the game.

Now the performance gap between PS4 and Xbone is more than 40%, and you're telling me they are still very close?

The whole point of his argument is that the difference in flops does not equate to the actual performance gap.

Mind you, from a pure flops perspective the PS3 also had higher figures than the 360; people just ignore that, and say that since the two now share the same architecture, the flops difference will matter more.
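For reference, the "more than 40%" figure comes from theoretical shader throughput, not measured performance. A quick sketch, assuming the widely circulated (not officially confirmed) shader counts and clocks for each GPU:

```python
# Back-of-the-envelope check of the "40%" figure.
# GCN GPUs do 2 floating-point ops per shader per clock (fused multiply-add).
# Shader counts / clocks below are the commonly reported ones, not official specs.

def tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical single-precision throughput in TFLOPS."""
    return shaders * 2 * clock_ghz / 1000.0

ps4 = tflops(1152, 0.800)  # 18 CUs x 64 shaders at 800 MHz -> ~1.84 TFLOPS
xb1 = tflops(768, 0.853)   # 12 CUs x 64 shaders at 853 MHz -> ~1.31 TFLOPS

gap = (ps4 / xb1 - 1) * 100
print(f"PS4 {ps4:.2f} TFLOPS vs XB1 {xb1:.2f} TFLOPS -> {gap:.0f}% more")
```

Which is exactly the point under dispute: the paper gap is easy to compute, but it says nothing by itself about how big the on-screen difference will be.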
 

badb0y

Member
In their defense, the bandwidth differential between a 7790 and 7850 would have made it a far worse comparison. The difference between 96GB/s and ~150GB/s of bandwidth is vastly greater at 1080p than the difference between 16 and 32 ROPs.

The higher ROP count mainly improves MSAA performance at 1080p (rather than baseline performance without it), and Eurogamer ran its tests without MSAA. It wasn't a perfect comparison, but it was closer than many people realize.

The bandwidth of the PS4 and Xbox One isn't the same anyway; effective bandwidth on the PS4 is better.
You are comparing PC parts. When you've done a deep dive on a next-gen dev kit, let me know.

The consoles are using PC parts. What's this obsession with trying to distance the consoles from the PC? These consoles are literally mid-range PCs in a box.
I can't imagine that situation on consoles. When devs code a game for consoles, they want to stress every component to the max. A good multiplatform console game will demand peak performance from both the CPU and the GPU. It's not like PC games, where you can pair a very strong CPU with a weak GPU, or crank up GPU-bound graphics options (resolution, AA, shaders, ...) or CPU-bound options (shadows, LOD, view distance, ...). As the end user, you balance performance yourself by tweaking options or buying components.

You have to offer a similar final framerate to users of both machines, so for a well-balanced console game you can't simply get better performance out of a stronger GPU, because your CPU will already be stressed to the max. Your only option is to add graphics-only extras on the machine with the stronger GPU, so the base CPU requirement isn't affected. And that's a very undesirable situation for a multiplatform developer.

On top of that, consider that many projects started on a DX11 platform as a base, and not everyone is going to devote the time needed for an optimal port to Sony's libraries. Perhaps the graphics-power gap over the Xbone will just be spent saving weeks of development on the port: once similar performance is achieved, they'll close out development, even though more time would have meant a better outcome on PS4.
Sorry, I don't understand what you're saying. Both consoles use the same CPU, and by virtue of the shared architecture anything that runs on Xbox One will run better on the PS4 unless there are bottlenecks in other areas.
 

TheKayle

Banned
So what you're saying is that these systems could have been $299, 800-GFLOPS machines and nobody would notice the difference in visuals.

What I'm saying is that it seems easier for GAFers to talk about an uber-duper 50% advantage than for devs to show it in real games... or maybe the Driveclub etc. devs aren't good enough... or maybe the difference is just 8-10 fps (which is a lot, but not what people are expecting).
 
You are comparing PC parts. When you've done a deep dive on next gen dev kit. Let me know.

It comes to a point where you need to bring forth some actual arguments, though.

PS4 will have more usable bandwidth than the Xbox One, more RAM available (6 GB, according to insiders), and a significantly stronger GPU.

We are talking about two systems where one has the advantage in pretty much everything, unlike previous generations, where even Xbox vs. PS2 had situations in which the PS2 hardware was more capable.

32 MB of ESRAM isn't something the Xbox One has over the PS4; it's something that was put in there so the system didn't suffer from a huge bandwidth problem. Also, comparing PC parts makes sense, because both are being compared in similar contexts.

There's nothing wrong with comparing two PC GPUs that have the same architecture as the PS4 and Xbox One GPUs and come from the same manufacturer, which produces the same drivers.

What I'm saying is that it seems easier for GAFers to talk about an uber-duper 50% advantage than for devs to show it in real games... or maybe the Driveclub etc. devs aren't good enough... or maybe the difference is just 8-10 fps (which is a lot, but not what people are expecting).

These consoles are releasing at the same time, and you still haven't seen any multiplat side by side. I know you really wish it were set in stone that there basically won't be any differences, because you need to feel good about it, but you really need to look at things logically.

You might decide right now in your head, but reality will just be itself, you know? I have no trouble believing that come fall 2014 the discussion will be about games looking good enough, instead of games looking the same.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
The consoles are using PC parts. What's this obsession with trying to distance the consoles to the PC? These consoles are literally mid-ranged PCs in a box.
What, in the same way that the Xbox 360 and PS3 use PC GPUs?

Ludicrous.
 
What, in the same way that the Xbox 360 and PS3 use PC GPUs?

Ludicrous.

Well, you didn't really make a new point there, did you? What is it about the Xbox 360 and PS3 that makes them a relevant comparison in this context?

Like the memory-pool thing. You ended up not replying in the other thread, but just so we're clear: the Xbox One doesn't have a split pool of DDR3 memory. Apps and games will feed from the same pool.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
There's nothing wrong with comparing two PC GPUs that have the same architecture as the PS4 and Xbox One GPUs and come from the same manufacturer, which produces the same drivers.
I agree with your other points.

I don't believe it's valid to search for a current PC GPU that seems similar to leaked specs and then run benchmarks on it as a proxy for next-gen console performance. It's laughable.
 

Josman

Member
The difference will be noticeable by release and will put any doubt to sleep, so we won't have to depend on articles from DF; if there is a power gap, it should show by the time they both launch.
 

twobear

sputum-flecked apoplexy
We are talking about two systems where one has the advantage in pretty much everything, unlike previous generations where even Xbox vs Ps2 had situations where PS2 hardware was more capable.

This is a really disingenuous comparison. Pretty much the single metric by which the PS2 was more capable was single-texture fillrate. In every other regard the Xbox was not only quantitatively superior but qualitatively superior too: it was the first home console with a GPU with pixel and vertex shaders. It could do things without breaking a sweat that the PS2 could only dream of.
 
The bandwidth of the PS4 and Xbox One isn't the same anyway; effective bandwidth on the PS4 is better.
Sure, but that difference is nowhere near the difference between 128-bit and 256-bit interfaces.

The HD 7770 in your chart represents a GPU with 72GB/s and compares it to a GPU with 150GB/s.
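Those figures fall straight out of bus width times effective memory data rate. A quick sketch, using the reference-card values for the HD 7770 and HD 7850 (board-partner cards may vary):

```python
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_bits / 8 * data_rate_gbps

hd7770 = bandwidth_gbs(128, 4.5)  # 128-bit GDDR5 at 4.5 Gbps -> 72.0 GB/s
hd7850 = bandwidth_gbs(256, 4.8)  # 256-bit GDDR5 at 4.8 Gbps -> 153.6 GB/s
print(hd7770, hd7850)
```

Doubling the bus width alone roughly doubles peak bandwidth, which is why the 128-bit vs. 256-bit distinction dominates this comparison.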
 

badb0y

Member
I agree with your other points.

I don't believe it's valid to search for a current PC GPU that seems similar to leaked specs and then run benchmarks on it as a proxy for next-gen console performance. It's laughable.

What's laughable about it?

PS4 uses a Pitcairn-based GPU and Xbox One uses a Bonaire-based GPU; this is about as close as it gets before we get the actual consoles in our hands.
Sure, but that difference is nowhere near the difference between 128-bit and 256-bit interfaces.

The HD 7770 in your chart represents a GPU with 72GB/s and compares it to a GPU with 150GB/s.
So you believe the difference in performance comes from the difference in memory bandwidth? If you leave the core clock alone and overclock the memory (which increases bandwidth), the performance gain is so minuscule it wouldn't change anything in the grand scheme of things. Maybe you'd squeeze out 5% of extra performance, but the fact of the matter is that the GPU would run out of processing power well before its memory bandwidth was saturated.
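One way to frame that claim is a roofline-style "balance point": peak compute divided by peak bandwidth gives the arithmetic intensity (flops per byte of memory traffic) above which a workload is compute-bound rather than bandwidth-bound. A sketch, using reference HD 7850 figures (~1.76 TFLOPS, 153.6 GB/s) purely for illustration:

```python
# Roofline-style balance point: workloads doing fewer flops per byte moved
# than this are bandwidth-bound; workloads doing more are compute-bound.
def balance_flops_per_byte(peak_tflops: float, peak_gbs: float) -> float:
    return peak_tflops * 1000 / peak_gbs

# Reference HD 7850 figures, illustrative only: ~11.5 flops per byte.
print(balance_flops_per_byte(1.76, 153.6))
```

Workloads sitting well above the balance point gain almost nothing from a memory overclock, which is consistent with the "maybe 5%" estimate above.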
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Sure, but that difference is nowhere near the difference between 128-bit and 256-bit interfaces.

The HD 7770 in your chart represents a GPU with 72GB/s and compares it to a GPU with 150GB/s.

Indeed.
 