
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

AgentP

Thinks mods influence posters' politics. Promoted to QAnon Editor.
Was this really necessary? Good interview overall, though.

You don't know Leadbetter then. The guy has been carrying water for MS and the Xbox brand since early in the 360 era. He'll take digs at anything Sony if he can.
 

TyrantII

Member
---


4A adapted two past-gen games with some DX11 features to consoles and still aimed for a locked 60fps; that's challenging. Definitely more challenging than what ND did.

And again, it's not really a complaint about ND, but locking 60fps is a different philosophy and a harder challenge than making your game run at 60fps most of the time.

Getting a game engine that ran on the low-level PPC Cell architecture to run indistinguishably the same on an x86 box is less challenging than porting and optimizing an already very good x86 engine to fixed x86 hardware?

I'm going to say no.

Moving all the rendering off the Cell to the GPU alone was probably a shitton of work.

And that's not a dig at 4A's wizardry. It's more a dig at the PS3's reliance on exotic architecture and pie-in-the-sky thinking about actual software development.
 

KKRT00

Member
Got a link? I read some DF stuff but I can't say I remember seeing it.

http://www.eurogamer.net/articles/digitalfoundry-2014-nvidia-geforce-gtx-750-ti-review
http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console
There was also one about the new DF PC based on a GTX 760, and they compare its performance to a lower-end PC in almost every face-off.

---
The first video is running at what seems to be 15fps, the second at a claimed 20, so it's probably even lower in reality.
Both are running at around 20-25fps. Both are also running at much higher settings than the console versions. Both are also running on very shitty PCs.

--
Link to the benches on Nvidia's 7 series or ATI's X1 series?
Got some benches for Nvidia's 7 series with the Cell processor offloading some of the GPU work? Or the X1 series with experimental unified shaders?
You've got a flops-to-flops comparison, and it should be enough for any comparison.
And you still haven't provided a single example [I've posted 5], but instead just talked in phrases like everyone else, without any solid argument.
 
This was always my takeaway from that quote, because we don't really have evidence that you need a system 2x as powerful as a console to play a game at the same quality and performance.

It'd be silly to argue that there aren't performance benefits from the console model, but what I, and many others, are saying when we argue against it is not that these benefits don't exist, but that they're greatly exaggerated.

Quoting stuff in context does not make for fun console wars.

You have plenty of data. The performance of every multiplatform (PC + console(s)) game ever released. You may not like what that data shows, but that's a different issue.
Again, this data does not make for fun console wars.

People are going to ignore your posts, Durante, and eventually just revert to 'but Naughty Dog', if they haven't already.
 
Link to the benches on Nvidia's 7 series or ATI's X1 series?

Crysis 3 does not run on the 7 or X1 series due to API limitations in the PC SKU. You can find multiple sources documenting the performance of X1000-series and 7000-series cards in Crysis 2, though.

Not in the form of listed benchmarks like Tom's Hardware or Guru3D, though. The cards are too old to be included in those comparisons (hardly anyone still had them, even in 2011).

On the other hand, the 8600 GT, which performs extremely similarly to the cards you're looking for, is documented as running Crysis 2 well.


Finding actual numbered benches of such cards for modern games is hard, but finding their power analogues is not hard at all. Just look at the lowest-end cards that exist.
 

belmonkey

Member
Crysis 2 and ME3 both suck in that regard, yes; Crytek's optimization is a joke on the PC and it's just as bad or worse on the gen-7 consoles. The BF3 video you posted is a blurry mess, and it's running on a processor many times faster than the gen-7 consoles', which makes a huge difference. PS3 exclusives are what should be used, since they were in a different league entirely from every multiplatform game excluding GTA 5.

Unfortunately though, those exclusives are only available on one platform, so we'll never know how a PC would have run them (probably just fine). Besides, multiplats make up a good chunk of console games anyway, so why not compare those games, which actually can be compared?
 

TyrantII

Member
...yet still no question about the amount of RAM that is available to PS4 games.

Are developers under threat from Sony not to reveal it or something? If so, why the big deal?


It is a tad weird the hard number hasn't come out, but any time anyone is asked, they laugh and say they're not even close to utilizing what they have.

I'm guessing that will be the case until they finally do hit that wall and start asking for more. If that even happens.

I'm a bit concerned (but not overly) that Sony won't be as diligent about reducing the OS footprint this time around. But they haven't failed to in the past, and there's obviously no need yet.

If anything, let's get more features first, like pause/resume, DLNA, and a media player.
 
http://www.eurogamer.net/articles/digitalfoundry-2014-nvidia-geforce-gtx-750-ti-review
http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console
There was also one about the new DF PC based on a GTX 760, and they compare its performance to a lower-end PC in almost every face-off.

---

Both are running at around 20-25fps. Both are also running at much higher settings than the console versions. Both are also running on very shitty PCs.

While those tests aren't bad, comparing launch software is always going to be a problem.
We're talking about the effect of the API, and the consoles' APIs were in major flux during the two years before they came out.
They're still making changes, but not at the pace they were before launch.

You have plenty of data. The performance of every multiplatform (PC + console(s)) game ever released. You may not like what that data shows, but that's a different issue.

Still, Durante, a fair amount of that data makes no sense to use.
If people really want to see the effect an API can have on a game compared to consoles, the PC spec must be as close as possible.
 

jgf

Member
You have plenty of data. The performance of every multiplatform (PC + console(s)) game ever released. You may not like what that data shows, but that's a different issue.

I don't have a "preferred outcome" for this benchmark; I treat all data equally. But I don't see the performance of a multiplatform title as a great indicator of the performance the game would have had if it were written specifically for a single platform. I don't know how much hand-tuned optimization usually goes into the console release of a multiplatform title.

Even if the multiplat game is perfectly optimized for my console, I would basically need to install Windows on my console and run the PC version of the game against the console version. Only that way could I see the impact of optimizing for a fixed platform.

As long as I haven't done this sort of optimization myself, I believe what well-respected developers say, as long as it sounds reasonable. If you have also optimized a complete game (or other similar software) for a specific target platform and came to a different conclusion, that's interesting. I don't have any reason to distrust you either. But are you saying they (the engine devs) are basically lying?
 
I don't have a "preferred outcome" for this benchmark; I treat all data equally. But I don't see the performance of a multiplatform title as a great indicator of the performance the game would have had if it were written specifically for a single platform. I don't know how much hand-tuned optimization usually goes into the console release of a multiplatform title.

Even if the multiplat game is perfectly optimized for my console, I would basically need to install Windows on my console and run the PC version of the game against the console version. Only that way could I see the impact of optimizing for a fixed platform.

As long as I haven't done this sort of optimization myself, I believe what well-respected developers say, as long as it sounds reasonable. If you have also optimized a complete game (or other similar software) for a specific target platform and came to a different conclusion, that's interesting. I don't have any reason to distrust you either. But are you saying they (the engine devs) are basically lying?

All of this sounds suspiciously like a religion:
the burden of proof is on others to prove god doesn't exist, and if data leading to that conclusion is shown, it gets ignored anyhow.

@post below: calling that optimisation is just marketing spin on 'lowering settings'
(which is exactly what most developers call it, for that reason).
Optimisation is doing the same with less, not doing less with proportionally less...
E3 trailer looks too good to be true but runs at 20fps with constant tearing down the middle (when it's actually running on console hardware).
When asked about it, the dev says: we still have a lot of optimisation to do, don't worry.
Read: we are going to lower the SHIT out of everything (and that tear is here to stay).
 

Alej

Banned
Let's rephrase all of this a little bit.

Design a multiplatform game. Now port it to PS4 or X1. You may be bottlenecked by the CPU, or ESRAM, or whatever, and lose fps across your whole product. What do you do? You tone down whatever is bottlenecking the hardware and you're good to go. It's called optimization.

But then, what if you can design the thing that is bottlenecking you differently? Let's say you can achieve the same result by coding the feature differently or, better, by taking advantage of something this particular platform offers. Great! You aren't bottlenecked anymore and you don't have to tone down the effect. This is called designing your software around your hardware. It's not always possible to make everything run well on a given piece of hardware, so you do the things that run great on that hardware and take advantage of it.

In the first case, you lose performance on one thing, bringing down the entire product, so you degrade it to recover performance. In the second case, you don't lose performance, so you don't degrade anything, and depending on other bottlenecks, maybe you achieve double the performance or better.

Now, the difference between the two cases lies only in design philosophy. Because the second case requires more time and money, it's not viable for multiplatform development to design games around every platform's advantages and flaws. They just make things more scalable, and that's it.

That is the kind of performance advantage hinted at here (and if it isn't, then the claim is false).
 

AgentP

Thinks mods influence posters' politics. Promoted to QAnon Editor.
Let's not forget this post about the PS4's RAM allocation:
http://www.neogaf.com/forum/showpost.php?p=106790279&postcount=198

[gif: "What the hell are you talking about?"]


How is RAM allocation relevant? Did the dev say anything about a lack of RAM? They have 10x what last gen had, and these are ports of last-gen games...
 

KKRT00

Member
While those tests aren't bad, comparing launch software is always going to be a problem.
We're talking about the effect of the API, and the consoles' APIs were in major flux during the two years before they came out.
They're still making changes, but not at the pace they were before launch.

Sure, but I'm not from the future, sorry :)

---
Nice that they actually use 512MB of RA... oh wait ;)

OMG... we've come full circle in this conversation.
 

Etnos

Banned
Counting pixel output probably isn't the best way to measure the difference between them though. There are plenty of other (and more important factors) that affect image quality besides resolution. We may push 40 per cent more pixels per frame on PS4, but it's not 40 per cent better as a result... your own eyes can tell you that.
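(A quick sanity check on that figure, assuming the typical 1920x1080 vs 1600x900 multiplatform split of this generation: 1920 x 1080 = 2,073,600 pixels, 1600 x 900 = 1,440,000 pixels, and 2,073,600 / 1,440,000 = 1.44, i.e. roughly 40-45 per cent more pixels per frame.)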

People are never going to understand this anyway...

Pretty good interview.
 
And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec. Practically achieving that performance takes some time, though!



bow.gif
 

Begaria

Member
Still won't quell the most... rabid XB1 owners, and a particular individual and his band of merry men (cult) in particular. Though I do await Mr. X's response 8). I'm expecting more diagrams...

He won't mention it, I'm sure. Does anyone else read his stuff just for laughs? It's really good satire, honestly; the comments are almost as inexplicable as the posts. I hate to give the guy any credit, but dammit if it isn't a joy to read when I need some comedy in my internet browsing.

Yeaaah... they already tore through it. Unsurprisingly, they turned the whole article on its head and claim it's definitive proof that the Xbox One is in fact superior to the PS4 in development tools and hardware (DX12 still "teh game changerz"), that the PS4 is stagnating and won't improve, and that every single Xbox One multiplatform game is clearly graphically superior to its PS4 counterpart. They completely glossed over the fact that Oles flat out said that the PS4 has the slightly better hardware, until a different article cropped up, which they attacked as "twisting the words Oles said."

No, I ain't makin' any of that up, quoted right from the comments of that blog:

We already knew this, but it is ONLY now MS allowing closer to the metal, so there were NOT able to get to the metal before (But the PS4 has already got to the metal)

Time to get scared Ponies.

LOL

They so badly don't want the Xbox One to be seen as different hardware as the PS4 :)

"Oles Shishkovstov: Well, similar GPU architecture is a good thing, really"

"As for the CPU - it doesn't really matter at all, as long as performance is enough. As for RAM hierarchy and its performance - it is different between platforms anyway."

Oh no, looks like he is trying to be kind to the PS4 :)

"Oles Shishkovstov: Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One

(and that's surely on the 'mono' driver with several fast-path calls utilised"

MONO driver :)

And here is the GOLD guys

"In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all?"

*** They started with Dx11 so they could get the console out the door!

This is why games have been slightly behind the PS4, and this is the exact reason why it will over take it and DOUBLE performance with DX12.

Can't wait for DX12 :)

It Gets better :)

"On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does

** So it can be one million times slower with DX11 (BUT STILL HAS PARITY IN THE GAME)

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model.

*** Releasing more power in stages :)

They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints."

*** So they are not even using ANY to the metal stuff and are able to achieve PARITY with the PS4.

So we have a slow API (Confirmed by the DEV) competing with a 'TO THE METAL'

Time to party guys. So much GOLD and confirms what I have been saying. THIS is a MEGATON

God, I love that blog. It's a great source of entertainment.

Regardless, good on 4A for pushing out really good versions of Metro 2033 and Last Light on each console in such a short amount of time. Can't wait to see what they can do when they really start digging into each system. It's also good to hear it reconfirmed that optimizations done on the consoles translate directly into optimizations for the PC versions.
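To make the "few DWORDs versus API bookkeeping" contrast from the quoted interview concrete, here is a minimal C++ sketch. It is a schematic model of my own, not GNM or Direct3D code; the opcode value and the validation steps are invented for illustration:

#include <cstdint>
#include <vector>

// Console-style path: the engine writes a draw as a handful of 32-bit
// words straight into memory that the GPU will later read. The cost per
// draw is a few stores.
struct CommandBuffer {
    std::vector<uint32_t> words;
    void draw(uint32_t vertexCount, uint32_t firstVertex) {
        words.push_back(0x1001u);     // hypothetical DRAW opcode
        words.push_back(vertexCount);
        words.push_back(firstVertex);
    }
};

// DX11-style path: the same draw first passes through a runtime that
// re-validates bindings and tracks resource state on every call. The
// per-draw CPU cost lives in that bookkeeping, not in the GPU work.
struct ValidatingRuntime {
    bool validateBindings() { return true; } // imagine walking all bound state here
    void trackResourceState() {}             // hazard tracking, residency, refcounting...
    void draw(uint32_t vertexCount, uint32_t firstVertex, CommandBuffer& out) {
        if (!validateBindings()) return;
        trackResourceState();
        out.draw(vertexCount, firstVertex);  // only now are the DWORDs written
    }
};

int main() {
    CommandBuffer direct, throughApi;
    ValidatingRuntime runtime;
    for (uint32_t i = 0; i < 10000; ++i) {
        direct.draw(3, 0);              // the "few CPU clock cycles" model
        runtime.draw(3, 0, throughApi); // the same draw, plus bookkeeping
    }
    return direct.words.size() == throughApi.words.size() ? 0 : 1;
}

The "one million times slower" line is obviously about pathological cases, but the shape of the overhead is the same: per-draw CPU work inside the runtime, none of which makes the GPU render anything faster.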
 
Oh, so you believe that optimisation DOES exist after all. And you're quoting that n00b Carmack!

You should check out my post history and see if there are any inconsistencies. There aren't. My opinion has always been the same: optimisations and coding to the metal provide some benefit, but it's nowhere close to 2x in the real world, and very few devs bother with that level of optimization anyway, which is why there isn't any evidence of console versions outperforming similar-spec PCs.
 

derExperte

Member
You should check out my post history and see if there are any inconsistencies. There aren't. My opinion has always been the same: optimisations and coding to the metal provide some benefit, but it's nowhere close to 2x in the real world, and very few devs bother with that level of optimization anyway, which is why there isn't any evidence of console versions outperforming similar-spec PCs.

That's too reasonable to be included in the PC GAF lore. Try again.

Both are running at around 20-25fps. Both are also running at much higher settings than the console versions. Both are also running on very shitty PCs.

A fact that gets ignored more and more. It's almost like some people are trying to revise history, downplaying these comparisons and forgetting the many, many problems last-gen games had. Even the fabled NGods couldn't do without major pop-in, sub-HD resolutions and noticeable framerate fluctuations.
 

Durante

Member
I don't have a "preferred outcome" for this benchmark; I treat all data equally. But I don't see the performance of a multiplatform title as a great indicator of the performance the game would have had if it were written specifically for a single platform. I don't know how much hand-tuned optimization usually goes into the console release of a multiplatform title.
You do treat all data equally -- in that you do, in fact, dismiss all available data based on flimsy argumentation, in favor of some platonic "ideally optimized" game. One which conveniently does not exist in the real world, and therefore is impossible to measure and obtain data for.

At the same time, you continuously try to shift the conversation from objective, measurable data to the opinion of individuals, and inject emotion-laden terms such as "lie" into it. Are you sure you don't have a preferred outcome?
 

Oemenia

Banned
You should check out my post history and see if there are any inconsistencies. There aren't. My opinion has always been the same: optimisations and coding to the metal provide some benefit, but it's nowhere close to 2x in the real world, and very few devs bother with that level of optimization anyway, which is why there isn't any evidence of console versions outperforming similar-spec PCs.
The article you posted talked about some very long lengths of time that are uneconomical for any developer. There is a lot of money to be made on consoles right now, and with graphics being a large selling point, I think the economic incentive is extremely high.

But whatever; some people have warmed to the idea that fixed hardware has some advantages. My responses were mainly to people who denied it in any form whatsoever.
 

jgf

Member
All of this sounds suspiciously like a religion:
the burden of proof is on others to prove god doesn't exist, and if data leading to that conclusion is shown, it gets ignored anyhow.

I'm not religious in any way. First of all, these statements are pretty much impossible to prove, as you never know if you could have done a better job at optimization. So I'm just trusting the opinion of an experienced developer. My world won't fall apart if he's wrong, but I don't see how anyone can really argue about this issue without writing an engine themselves. Note that he also says they "can usually get 2x performance gain", not that magically every game will run 2x faster.

Let's rephrase all of this a little bit.

Design a multiplatform game. Now port it to PS4 or X1. You may be bottlenecked by the CPU, or ESRAM, or whatever, and lose fps across your whole product. What do you do? You tone down whatever is bottlenecking the hardware and you're good to go. It's called optimization.

But then, what if you can design the thing that is bottlenecking you differently? Let's say you can achieve the same result by coding the feature differently or, better, by taking advantage of something this particular platform offers. Great! You aren't bottlenecked anymore and you don't have to tone down the effect. This is called designing your software around your hardware. It's not always possible to make everything run well on a given piece of hardware, so you do the things that run great on that hardware and take advantage of it.

In the first case, you lose performance on one thing, bringing down the entire product, so you degrade it to recover performance. In the second case, you don't lose performance, so you don't degrade anything, and depending on other bottlenecks, maybe you achieve double the performance or better.

Now, the difference between the two cases lies only in design philosophy. Because the second case requires more time and money, it's not viable for multiplatform development to design games around every platform's advantages and flaws. They just make things more scalable, and that's it.

That is the kind of performance advantage hinted at here (and if it isn't, then the claim is false).

Yeah, that's pretty much how I interpreted it too.
 
I thought this was interesting:
Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a low CPU overhead there in addressing the GPU?

Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.
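For anyone wondering what "multi-threaded command buffer chunks generation" looks like in practice, here is a small self-contained C++ sketch of the DX12/GNM-style recording model the quote alludes to. The Command struct and opcode are invented; the point is the threading shape, not any real API:

#include <cstdint>
#include <thread>
#include <vector>

struct Command { uint32_t opcode, arg0, arg1; }; // stand-in for real GPU packets
using CommandList = std::vector<Command>;

int main() {
    const int kWorkers = 4;            // e.g. one recording thread per core
    const int kDrawsPerWorker = 10000;
    std::vector<CommandList> lists(kWorkers);
    std::vector<std::thread> workers;

    // Recording is fully parallel with no locks, because each thread
    // appends only to its own list -- the DX12 model. Under DX11 the
    // equivalent work funnels through a single immediate context.
    for (int w = 0; w < kWorkers; ++w) {
        workers.emplace_back([&lists, w, kDrawsPerWorker] {
            lists[w].reserve(kDrawsPerWorker);
            for (int i = 0; i < kDrawsPerWorker; ++i)
                lists[w].push_back({0x1001u, uint32_t(i), 0u}); // record one draw
        });
    }
    for (auto& t : workers) t.join();

    // Submission order is decided once, here, on one thread.
    CommandList frame;
    for (auto& l : lists) frame.insert(frame.end(), l.begin(), l.end());
    return (int)frame.size() == kWorkers * kDrawsPerWorker ? 0 : 1;
}

The win isn't that any single draw gets cheaper to record; it's that the recording work spreads across cores instead of serializing on one context, which is exactly the 'flop' of the DX11 model that Shishkovstov describes.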
 

jgf

Member
You do treat all data equally -- in that you do, in fact, dismiss all available data based on flimsy argumentation, in favor of some platonic "ideally optimized" game. One which conveniently does not exist in the real world, and therefore is impossible to measure and obtain data for.
I laid out why I "dismiss all available data"; if you have an argument against it, feel free to share. That does not mean it's automatically proven that the 2x performance claim is true. I merely stated that I believe what multiple engine developers have said. There's no need to get aggressive. I don't rule out the possibility that I'm wrong, but you shouldn't either.

At the same time, you continuously try to shift the conversation from objective, measurable data to the opinion of individuals, and inject emotion-laden terms such as "lie" into it. Are you sure you don't have a preferred outcome?
Yes, I'm pretty sure. And I don't get why I'm met with such a backlash.
 

scoobs

Member
Yeaaah... they already tore through it. Unsurprisingly, they turned the whole article on its head and claim it's definitive proof that the Xbox One is in fact superior to the PS4 in development tools and hardware (DX12 still "teh game changerz"), that the PS4 is stagnating and won't improve, and that every single Xbox One multiplatform game is clearly graphically superior to its PS4 counterpart. They completely glossed over the fact that Oles flat out said that the PS4 has the slightly better hardware, until a different article cropped up, which they attacked as "twisting the words Oles said."

No, I ain't makin' any of that up, quoted right from the comments of that blog:



God, I love that blog. It's a great source of entertainment.

Regardless, good on 4A for pushing out really good versions of Metro 2033 and Last Light on each console in such a short amount of time. Can't wait to see what they can do when they really start digging into each system. It's also good to hear it reconfirmed that optimizations done on the consoles translate directly into optimizations for the PC versions.
What a strange, strange group of people. Someone should do a documentary or something; it's fascinating how he's somehow convinced hundreds of people to religiously follow his... "work".

I'd like to understand how anyone could act like they do about a couple of electronic devices that are essentially the same damn thing.
 

Durante

Member
I laid out why I "dismiss all available data"; if you have an argument against it, feel free to share.
Against what? Dismissing all available data when you are trying to establish the veracity of a claim? Do I really need to argue against that?

Basically, we have a ton of data demonstrating almost without exception that at the very least the common, simplistic interpretation of the "x2 quote" is completely inaccurate. In the absence of any other evidence, that's a pretty damn convincing state of affairs.
 
I laid out why I "dismiss all available data"; if you have an argument against it, feel free to share. That does not mean it's automatically proven that the 2x performance claim is true. I merely stated that I believe what multiple engine developers have said. There's no need to get aggressive. I don't rule out the possibility that I'm wrong, but you shouldn't either.


Yes, I'm pretty sure. And I don't get why I'm met with such a backlash.
john 4:180


Just like with the bible, the dogma written within is often misinterpreted
sneaky 5:42

(I'm just having fun with you, jgf :p)
 

Kezen

Banned
I wouldn't be surprised if we needed two times more CPU power to run a game at the same settings as consoles. We will probably also need more system RAM, and a bit more GPU power (10-15%).

Overall, I do not think this is something the PC space should be worried about. In addition, DX12 is coming and will most likely close the efficiency gap between PC and consoles to a great extent. Once DX12 makes its way into our PC games, we will need even less to compete with consoles.

Great times ahead for PC gaming.
 
A very good interview.

It's interesting to hear that they will target 30fps on their next console game. I think this will become more common as the generation matures. I was surprised when Naughty Dog said they were targeting 60fps for Uncharted 4 because the visual hit will be significant. Maybe ND goes back to 30fps for TLoU 2. Similarly, while MGS V was 60fps, perhaps Kojima-san goes back to 30fps for Silent Hills or MGS VI.

As always, 60fps will be the correct choice for some games, 30fps for others. On fixed hardware platforms, tradeoffs always have to be made.

There won't be a visual hit from choosing 60fps over 30fps. The visuals will only look better, and 4A is making a mistake by going for 30fps when they've shown they're capable of 60fps.
 
I think that 2x quote is mostly about the CPU: if you got a PC CPU equivalent to the consoles' CPU, you would probably get 2x the performance on the console.

But if you are not CPU-bottlenecked on PC, there's no way consoles are going to run as well as a PC with 2x the GPU power. I can't see that happening.

This is just a guess though; I'm no expert.
 

DieH@rd

Banned


So... by "Windows 8" she meant not only trouble with an OS that has no place on a gaming console, but also trouble with the slow DirectX 11.


I really can't imagine what MS was thinking when they went with DX11 as the main API for the Xbone...
 

coastel

Member
They are not talking about the overall performance of games. And everyone should know this, because there isn't a single example confirming this claim, even from either of those two guys.

---

Marginally? By running games at 1080p on higher settings and at a higher framerate?
[benchmark chart: Mass Effect 3 demo performance]


CRYSIS 2
[benchmark chart: Crysis 2, 'Gamer' settings, 1900x1200]



Not trying to disagree, but it should be a comparison across the whole of the hardware available at the time; the top result is using an i7.

--
Or a 32-player multiplayer match in Battlefield 3 at 50-60fps on higher settings:
https://www.youtube.com/watch?v=wzW8eS9AZNs

That's not marginal; it destroyed the past-gen consoles even at the end of their cycle.

Not trying to disagree, but it should be a comparison across the whole of the hardware available at the time; the top result is using an i7.
 

jgf

Member
Against what? Dismissing all available data when you are trying to establish the veracity of a claim? Do I really need to argue against that?

Against the point that comparing the performance of multiplatform titles between a console and a PC with somewhat similar specs does not measure the performance gain from optimizing for a fixed platform. And no, obviously you don't need to argue; that's all optional.

All I see is an engine dev in the interview stating that they can normally achieve a 2x performance gain on a fixed platform. In the interview the dev seems pretty knowledgeable and is not talking PR bulls***. Given the complexity of modern architectures, that sounds believable to me. Also, I don't see a reason for him to lie on that specific point.

He never said that all multiplatform titles run 2x faster on console.

john 4:180


Just like with the bible, the dogma written within is often misinterpreted
sneaky 5:42

(I'm just having fun with you, jgf :p)

That's OK, I laughed ;) Still, if I were religious in that sense, I would say that the 2x performance claim must be true and that no one can argue against it. I just say, given multiple devs stating similar things, that it may very well be true.
 

Stumpokapow

listen to the mad man
I laid out why I "dismiss all available data"; if you have an argument against it, feel free to share. That does not mean it's automatically proven that the 2x performance claim is true. I merely stated that I believe what multiple engine developers have said. There's no need to get aggressive. I don't rule out the possibility that I'm wrong, but you shouldn't either.

To try to take a different tack on this: you did ask about his experience with this stuff, as though that answer was relevant to the debate (which it's not). Durante's too modest to respond, so I think I'll step in for him a bit.

The person you're talking to has a PhD in Computer Science and is a researcher working in, IIRC, compiler optimization for high-end parallel computing on heterogeneous processors. He's most famous on the internet for writing the code that fixed the PC port of Dark Souls, and more recently for a tool that lets users apply supersampling to any game they want. I remember from earlier posts that he also has quite a decent amount of experience writing drivers.

So, knowing absolutely nothing about this conversation, I would say that Durante has the kind of experience profile associated with being able to make general claims on this subject, even though he doesn't work in gaming.
 
If achieving a 2x performance increase is the norm, then let's see some proof. It shouldn't be that difficult. Digital Foundry has done a lot of current-gen and PC face-offs; post some results. Like so:

http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console

The R7 260X is roughly on the same level as the PS4's GPU, and it performs similarly, even in unoptimized ports like Assassin's Creed 4. This is proof. Actual benchmarks. Anyone who wants to prove otherwise should also provide proof.
 

El_Chino

Member
If achieving a 2x performance increase is the norm, then let's see some proof. It shouldn't be that difficult. Digital Foundry has done a lot of current-gen and PC face-offs; post some results. Like so:

http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console

The R7 260X is roughly on the same level as the PS4's GPU, and it performs similarly, even in unoptimized ports like Assassin's Creed 4. This is proof. Actual benchmarks. Anyone who wants to prove otherwise should also provide proof.
Nice.
 

Durante

Member
Against the point that comparing the performance of multiplatform titles between a console and a PC with somewhat similar specs does not measure the performance gain from optimizing for a fixed platform. And no, obviously you don't need to argue; that's all optional.
So, if you can't do it by looking at the same game on multiple platforms, how would you go about measuring this claim?

The reason I argue so vehemently against your position is that I simply don't believe that arguing on the basis of the relative prestige of whoever made a statement has much merit, especially not if there is objective data available instead. Sure, such data may not be perfect, but dismissing it out of hand seems like throwing away your best chance to make a reasoned assessment.

Basically, the problem is this. One developer can say that they get a factor of 10 performance improvement on console compared to an equivalently specced PC. Another can say they see less than 2% improvement.

Both can very well be right. At the same time. Maybe the former is looking at a draw-call limited scenario and comparing DX9 code on PC to very low level code on console. And the other is simply measuring the time it takes for his main deferred shading pixel shader to run.

And that's the true issue with the "2x" quote and its ilk: it's simply not specific enough to be of any value. It doesn't reduce down to a "truth" or a "lie" as you would seem to believe. Because of this ambiguity, these quotes are dragged out in every argument, regardless of their applicability to the scenario, component or bottleneck being discussed. What you get then is a simple appeal to authority, and discussions that go in circles and never get to the actual issues.

I hope this clarifies things.
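Durante's "both can be right" point is easy to put in numbers. A toy model of my own, with made-up timings: treat a pipelined frame as taking roughly max(CPU submission time, GPU execution time), and let a thin API cut only the CPU term:

#include <algorithm>
#include <cstdio>

// Toy model: frame time ~= max(CPU submit, GPU execute) in a pipelined
// renderer. A low-overhead API mostly shrinks the CPU term, so the
// measured "console advantage" depends entirely on which side was the
// bottleneck in the scene you happened to measure.
static double frameMs(double cpuMs, double gpuMs) {
    return std::max(cpuMs, gpuMs);
}

int main() {
    const double cut = 10.0; // assume the thin API makes submission 10x cheaper

    // Hypothetical draw-call-limited scene: CPU-bound.
    double before1 = frameMs(40.0, 8.0);
    double after1  = frameMs(40.0 / cut, 8.0);
    std::printf("CPU-bound: %.1f ms -> %.1f ms (%.1fx)\n", before1, after1, before1 / after1);

    // Hypothetical shader-limited scene: GPU-bound.
    double before2 = frameMs(4.0, 16.0);
    double after2  = frameMs(4.0 / cut, 16.0);
    std::printf("GPU-bound: %.1f ms -> %.1f ms (%.2fx)\n", before2, after2, before2 / after2);
    return 0;
}

The same API improvement reads as a 5x win in the first scene and as nothing at all in the second, which is exactly why a bare "2x" quote, detached from the scenario it was measured in, carries so little information.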
 
You should check out my post history and see if there are any inconsistencies. There aren't. My opinion has always been the same: optimisations and coding to the metal provide some benefit, but it's nowhere close to 2x in the real world, and very few devs bother with that level of optimization anyway, which is why there isn't any evidence of console versions outperforming similar-spec PCs.

I agree - the extra time and work needed to get this 2x (or whatever the number is) extra performance by coding to the metal on consoles is simply not worth it for multiplatform releases. That's why we don't see a multiplatform game on consoles running 2x better than on equivalently specced PC hardware. It's only in games built around the specs of a specific console (Halo 5, Uncharted, etc.) that this 2x efficiency can be seen.
So yeah, I believe what Carmack and other developers say, but understand that it is something that only applies to certain scenarios in practice.
 

coastel

Member
If achieving a 2x performance increase is the norm, then let's see some proof. It shouldn't be that difficult. Digital Foundry has done a lot of current-gen and PC face-offs; post some results. Like so:

http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console

The R7 260X is roughly on the same level as the PS4's GPU, and it performs similarly, even in unoptimized ports like Assassin's Creed 4. This is proof. Actual benchmarks. Anyone who wants to prove otherwise should also provide proof.

Installed in a PC featuring an Intel i7-3770k processor clocked at 4.3GHz per core, and 16GB of DDR3 RAM,

From the article. You need to take into account that this is early in the gen, and also that the CPUs in both consoles are nowhere near as good as an overclocked i7. You need to compare a chip that performs similarly to make it a bit of a better comparison; and again, better tools later in the gen may get more performance out of the consoles than they're getting now. I don't agree with the 2x performance thing, btw.
 
You people just use that "2x more perf" quote in fanboyish ways. Please stop; it's stupid, you aren't devs or anything close to it.
Endlessly I see odd comparisons flourish here, like comparing multiplatform games to prove anything.

Comparing multiplatform games between PC and consoles will always be about brute-forcing things. Why? Basically because those multiplatform games aren't designed to take advantage of platform-specific hardware features (and it's the same for console-wars bullshit using multiplatform software), so yes, more flops = more performance here, always.

PC will always have a power advantage over consoles; consoles are just limited by the tech available at the time, and that's it. It's undebatable. Problems arise when you compare things that aren't very comparable, and furthermore using bad examples (multiplatform games).

Yes, Crysis 3 on PS3 is exactly what you should expect of equivalent hardware on PC (if it exists), because that kind of game is distributed across a big range of hardware and isn't designed around the advantages (and flaws) of one platform in particular. But a game like TLOU is something you can only dream of achieving with equivalent hardware in a PC. That doesn't mean PC hardware isn't as capable; I bet if a PC developer designed a game just for one particular configuration (say, a PC with tflops and bandwidth equivalent to the PS3's), they could build a game as ambitious as TLOU, because the hardware is effectively as capable.

This particular problem resides in design philosophy. Designing a game around the hardware of a closed platform improves efficiency by a lot. You could say, to wrap this up, that multiplatform console gaming and PC gaming have the same limiting factors. PC is as if there were plenty of other consoles out there, some with more power and some with less, and games designed to run on all of them.

Ultimately, I hear every now and then that consoles are a limiting factor for PC gaming, but in fact the biggest limiting factor for high-end PC gaming is the PC itself. If there were a label like "high-end PC gaming only", forming a high-end platform with devs targeting this (and only this) range of high-end hardware, you would see things right now that you never imagined your shiny hardware was capable of, things you otherwise won't see until the next generation of consoles. I know it's frustrating, and it's a shame that the hardware isn't fully used and never will be, because high-end PC gaming is niche (in the eyes of big publishers), but it's not the fault of console gaming or even budget PC gaming; it's because devs don't (or aren't economically allowed to) design their games for your hardware.

That's why, in fact, plenty of us choose the console road.

Good post - why can't we have one day without PCGaf and ConsoleGaf flinging shit at each other?
 
Let's rephrase all of this a little bit.

Design a multiplatform game. Now port it to PS4 or X1. You may be bottlenecked by the CPU, or ESRAM, or whatever, and lose fps across your whole product. What do you do? You tone down whatever is bottlenecking the hardware and you're good to go. It's called optimization.

But then, what if you can design the thing that is bottlenecking you differently? Let's say you can achieve the same result by coding the feature differently or, better, by taking advantage of something this particular platform offers. Great! You aren't bottlenecked anymore and you don't have to tone down the effect. This is called designing your software around your hardware. It's not always possible to make everything run well on a given piece of hardware, so you do the things that run great on that hardware and take advantage of it.

In the first case, you lose performance on one thing, bringing down the entire product, so you degrade it to recover performance. In the second case, you don't lose performance, so you don't degrade anything, and depending on other bottlenecks, maybe you achieve double the performance or better.

Now, the difference between the two cases lies only in design philosophy. Because the second case requires more time and money, it's not viable for multiplatform development to design games around every platform's advantages and flaws. They just make things more scalable, and that's it.

That is the kind of performance advantage hinted at here (and if it isn't, then the claim is false).

Exactly. It's the developers' ingenuity, and the fact that they can focus on a single fixed hardware spec, that enables them to get more out of consoles. That is why only first-party or exclusive software truly pushes the host console to levels a similarly specced PC can't match. With multiplatform games this isn't the case, not because developers can't, but because they don't want to, for understandable reasons.
 

omonimo

Banned
If achieving a 2x performance increase is the norm, then let's see some proof. It shouldn't be that difficult. Digital Foundry has done a lot of current-gen and PC face-offs; post some results. Like so:

http://www.eurogamer.net/articles/digitalfoundry-2014-r7-260x-vs-next-gen-console

The R7 260X is roughly on the same level as the PS4's GPU, and it performs similarly, even in unoptimized ports like Assassin's Creed 4. This is proof. Actual benchmarks. Anyone who wants to prove otherwise should also provide proof.
I thought that on PC you need a CPU two times better than the PS4's to run the same equivalent graphics in a game, most of the time. But I find the whole comparison pointless. If PC were like-for-like with console, why can't a PC rig comparable to the PS4 run the same things?
 

belmonkey

Member
Installed in a PC featuring an Intel i7-3770k processor clocked at 4.3GHz per core, and 16GB of DDR3 RAM,

From the article. You need to take into account that this is early in the gen, and also that the CPUs in both consoles are nowhere near as good as an overclocked i7. You need to compare a chip that performs similarly to make it a bit of a better comparison; and again, better tools later in the gen may get more performance out of the consoles than they're getting now. I don't agree with the 2x performance thing, btw.

It's not too much different on an FX 6300 that costs $200 less (from the same article):

https://www.youtube.com/watch?v=66BiQsOM9_M

The campaign is more GPU-bound than CPU-bound.

According to this article, there isn't much FPS difference even between a hexacore Intel CPU and a quad-core FX 4100:

[benchmark chart: CPU comparison]


And the effect of overclocking an i7?

[benchmark chart: effect of overclocking an i7]


Small.
 