
"Lazy devs" - is this really an argument?

Not buying a product because it doesn't live up to your standards is totally fair, but then going on the Internet and calling the developers lazy is bullshit.
The assertion was that without technical and production knowledge, we're not qualified to be in the conversation. But the moment you try to sell a product to me, I'm in the conversation, and you are open to criticism. And in a commercial transaction, all the customer is going to see is excuses.
 

nynt9

Member
The ps3 is bad hardware as well by your twisted metrics. "It's on Sony".

I think there's a distinction between "worse hardware" and "harder-to-code-for hardware" (in this case the XB1 is both, but I'm not sure the PS3 was objectively worse; it's just so weird that it's hard to compare).
 

nakedeyes

Banned
Firstly, no, I don't really think the "lazy devs" comment holds much water. There is one large technical difference between the Xbone and the PS4, and most developers are going to handle that difference by bringing down the res on the Xbone version rather than reducing IQ and/or fundamentally changing rendering techniques just for the Xbone.

That stuff amounts to a large art-direction change and/or a huge technical shift for what would otherwise presumably be a multiplatform engine. It's not "lazy" like not wanting to take the trash out on Sunday evening. Believe me, if there were a make-Xbone-1080p checkbox they could click somewhere, it certainly would be clicked! So frustrating to hear people imply that devs are intentionally stiffing the Xbone version.

I remember the PS3 also suffered a lot because of these "lazy dev" porting jobs

This. Cause (I'm going to be pretty high level here), in order for the PS3 to meet X360-quality visuals, you had to push some of the graphics processing onto the Cell processor (which, as it was, only worked for certain graphics processing in certain pipelines). PS3 games tended to come in with a few different mindsets:
* Straight Port - They would just push the multiplatform version onto the PS3, using the somewhat underpowered GPU to match the Xbox's and the Cell's main PPC core (two hardware threads) for the CPU load. These tend to be the quick, dirty early Unreal Engine ports that ran like shit.
* Pimped out for PS3 - Usually these are exclusives (or multiplats with PS3 as the lead platform). Usually these are very refined rendering pipelines that utilize the Cell's 6 SPUs to assist with rendering (and/or CPU) tasks. This is how you get awesome graphics in MGS4: let the GPU render simple geometry at super high resolution, and then let the Cell help with post-process effects and stuff.
* In-betweener - The games my studio works on are like this. Basically we start with the X360 -> PS3 dirty port, then build some general-purpose job execution processes on the Cell SPUs, then push a little bit of CPU and GPU post-process load onto them! It's not pimped out from top to bottom, but it definitely at least takes advantage of the Cell. I personally think our PS3 versions typically perform better than the X360 ones.

The PS4<->Xbone ports are a different issue, and arguably a worse one! Cause there are only a couple of options for anyone hoping to make ANY 1080p game on the Xbone (it's not even a porting issue, but rather one of making ANY game on Xbone hardware):
* Alter the entire render pipeline to fit a 1080p buffer in ESRAM - the way you do this is by removing render passes. Imagine cutting bump mapping or something like that. Nowadays even a pretty barebones renderer is going to want to store 5-7+ image buffers to composite the full image (a rough size sketch follows this list), so you are asking devs to get by with the bare minimum or fewer buffers to composite that final image; you are asking them to remove key components that keep image quality up.
* Skip the ESRAM! I am less sure of this one, but I'm pretty sure they can choose to use the slower DDR3 RAM to store the image buffers. But it's pretty slow compared to the ESRAM (like half the speed or less!), so you may well get a worse framerate. I'm not 100% sure, but I think this is how Tomb Raider: DE achieved its slightly choppy 1080p framerate (would love more info here if anyone has any light to shed).
* Finally: they can get a decent framerate and their original image quality with full effects if they just pare down the resolution of the image to fit in the ESRAM. I think this is the easiest thing to do (hence people calling devs lazy), but it's ultimately what gives up the least as far as the artist's vision of the game is concerned (in my opinion).
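To put rough numbers on that first option, here's a back-of-the-envelope sketch in Python. The buffer count and the 4-bytes-per-pixel format are illustrative assumptions, not any specific engine's layout:

```python
# Does a set of full-screen render targets fit in the Xbox One's 32 MB ESRAM?
# Buffer count and pixel format are illustrative assumptions.

ESRAM_BYTES = 32 * 1024 * 1024

def gbuffer_bytes(width, height, render_targets=5, bytes_per_pixel=4):
    """Total footprint of `render_targets` full-screen buffers
    (e.g. albedo, normals, depth, specular, composite)."""
    return width * height * bytes_per_pixel * render_targets

for label, (w, h) in [("1080p", (1920, 1080)),
                      ("900p", (1600, 900)),
                      ("720p", (1280, 720))]:
    size = gbuffer_bytes(w, h)
    verdict = "fits" if size <= ESRAM_BYTES else "does NOT fit"
    print(f"{label}: {size / 2**20:.1f} MB for 5 targets -> {verdict}")
```

Under those assumptions, five targets at 1080p need roughly 40 MB and blow the 32 MB budget, while 900p squeezes in at about 27 MB, which is exactly why dropping resolution is the path of least resistance.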
 
It's less an accusation of laziness against the devs than a demonstration of the intellectual laziness of the speaker.

Devs work FUCKING HARD. It's one of the dumbest insults lobbed in gaming.
 
I think this train of thought exists not because any significant number of people actually believes it, but because there are people wholly invested in wanting others to believe that a significant number of people actually believe this sort of thing. ;)

I actually believe that's this thread's main purpose.
 
The ps3 is bad hardware as well by your twisted metrics. "It's on Sony".

No, it was badly designed. The PS3 was still more powerful overall, but very few games actually took advantage of it. The X1, on the other hand, is just bad, underpowered hardware. Devs can work all they want, but it won't help until there's a new Xbox.
 
Xbox One underperforming isn't the only time I've seen the "lazy devs" argument.
It also comes up occasionally if the PS4 version doesn't outperform the Xbox One version, or if there is parity.
And of course in many other situations not related to the console wars.

But anyways, no it's not a valid argument.

Developers always try to meet the production goals, but there are inherent constraints that limit the result (budget, schedule, etc.).
In fact, devs in the game industry have a reputation for being overworked.

"Lazy devs" is just an easy excuse to throw out if you're angry and have no idea how game development actually works.
 

jman2050

Member
If a completed game even gets to market in the first place then the devs are by definition not "lazy"

Of course that "completed" designation is important. With how easy it is to self-publish nowadays it's certainly possible for obviously unfinished and half-baked designs to go for sale somehow.
 
This is one of the most bullshit things gamers can spout. Not saying it never happens, but c'mon. Have some respect for the people who make this hobby possible.

Now publishers on the other hand... Lots of fodder on that side of things.
 

codecow

Member
I have worked with thousand(s) of different devs over 17 years of programming games professionally, and I've met only one whom I would consider "lazy".

I often use the term when talking to friends about games, as it's funny to anyone who actually works in the industry; I think most of them have had an experience similar to mine when it comes to their co-workers.
 
Whenever the subject of an Xbox One game performing worse than its PS4 counterpart comes up, someone inevitably brings up the argument "The lazy devs are to blame!" For some reason, the developers are selectively lazy when it comes to the Xbox One version? And not just a few developers, but developers from Ubisoft, EA, Activision, Konami, Square Enix, and a bunch of smaller companies besides. Are all these developers lazy (considering they all had games that performed better on the PS4)?

It's a given fact that the PS4 has better hardware than the Xbox One, so I guess the argument is that "if the devs worked harder, they would be able to make the inferior hardware of the Xbox One perform identically to the PS4 version"? But then, considering the hardware differential, couldn't they make the PS4 version even better? Unless they decide to gimp the PS4 version, wouldn't there almost always be small differences between the versions? Does this train of thought even hold water? Because more time spent on development can't overcome, for example, the 32MB ESRAM being unable to hold certain types of rendering buffers in 1080p. So, doesn't that mean regardless of how hard devs work, it won't make a difference?

Basically, I'm wondering if there is anything to this argument, or if it's just mindless comments thrown out to justify the poor performance of Xbox One versions of multiplats by blaming it on devs.

Edit: A lot of people seem to be missing the bolded part.

I've actually never heard anyone use the lazy-dev excuse to justify weaker console specs. My impression is that players latch onto the relative specs of each console despite not understanding what ESRAM is, and that they're more likely to blame specs rather than devs for graphically inferior ports on the XBone. The nature of high-fidelity multiplats is that they provide a superficial method of comparing console power.
 

Eusis

Member
I remember the PS3 also suffered a lot because of these "lazy dev" porting jobs
I feel like that holds way more water than it ever did in relation to the Xbox One: that was about figuring out how to properly utilize an esoteric CPU, while the Xbox One is straight up weaker, period. It's like calling someone lazy for performing a task more slowly with a slightly suboptimal tool, even though they got the job done, and nearly as well, anyway.

With that said, even in those PS3 cases I wouldn't go "lazy dev"; I don't even want to use that argument without actual accounts of what went on at these studios, or better yet personal experience. Developing AAA games isn't an easy task, and I'd sooner expect that they just weren't given enough time or staff to do a proper job. Do you call a baker lazy for not getting a cake done in half an hour rather than forty minutes? It's obviously possible that they were given a generous time limit and STILL screwed up something they could easily have avoided, but most of us wouldn't personally know, and it's probably not very likely on major AAA releases with hundreds if not thousands of people working on them.
I have worked with thousand(s) of different devs over 17 years of programming games professionally, and I've met only one whom I would consider "lazy".

I often use the term when talking to friends about games, as it's funny to anyone who actually works in the industry; I think most of them have had an experience similar to mine when it comes to their co-workers.
Yeah, this is what I mean by needing personal experience. There's no way I can call someone I don't know lazy in a field I don't have enough familiarity with; the only ones who have that right are those who actually know what can and can't be done and/or know the people personally, preferably both.
 

ScOULaris

Member
No, it's not an argument. Development teams are typically made up of pretty talented people working very long hours. Even bad games take a lot of work and skill to create.
 

nynt9

Member
Firstly, no, I don't really think the "lazy devs" comment holds much water. [...]

This. Cause (I'm going to be pretty high level here), in order for the PS3 to meet X360-quality visuals, you had to push some of the graphics processing onto the Cell processor. [...]

This is pretty much how I thought it all worked, so thanks for confirming my inner speculations about how porting to Xbox One would work.
 

tanod

when is my burrito
I thought this evolved to "lazy publishers" a long time ago.

The relatively lower-performing ports led to lower sales of the PS3 versions in North America. It was a self-fulfilling prophecy.
 

Orayn

Member
Firstly, no, I don't really think the "lazy devs" comment holds much water. [...]

This. Cause (I'm going to be pretty high level here), in order for the PS3 to meet X360-quality visuals, you had to push some of the graphics processing onto the Cell processor. [...]

This is a really nice insightful post and I'd love to see more like it.
 

klaus

Member
Firstly, no, I don't really think the "lazy devs" comment holds much water. [...]

This. Cause (I'm going to be pretty high level here), in order for the PS3 to meet X360-quality visuals, you had to push some of the graphics processing onto the Cell processor. [...]

Well put, thanks for the enlightening read! Do you have any insight into whether tiled rendering might become an option on XB1, with DX12 (supposedly) reducing draw-call/CPU overhead? As far as I remember, the biggest problem with doing a tiled renderer was having to send the geometry twice, or am I missing something (perhaps vertex shader overhead, or perhaps the final image tiles having to be moved to DDR3, choking bandwidth)?

Just asking because MS promised the same for the 360 (tiled rendering overcoming the limitations of EDRAM), but to my knowledge that never really happened.
 

Eusis

Member
Then people should say that.
Some people really are asses in that regard, and others don't stop to think about whether it's really a good shorthand to use, even if that's how they mean it rather than as a literal charge of laziness.

Well, we can just say cheap publishers, or, even more generally, stupid publishers. Even that can be unfair, but at least it's rooted in either A. not being willing to put forth the money necessary for a good product or B. overestimating what they can deliver in a certain time frame regardless of money and resources.
 
I think the problem is that no company, developer or publisher, has unlimited time and resources. In the end, faced with deadlines and the need to, you know, make money, you have to set priorities and standards.

And those standards will pretty much always fall short of player demands, because, well, players demand pretty much everything.
 

Laughing Banana

Weeping Pickle
I don't see how you can look at stuff like Dragon Age 2's copy-paste dungeons or FIFA 13-14 on Vita and refrain from calling them lazy.
 

nynt9

Member
Well put, thanks for the enlightening read! Do you have any insight into whether tiled rendering might become an option on XB1, with DX12 (supposedly) reducing draw-call/CPU overhead? As far as I remember, the biggest problem with doing a tiled renderer was having to send the geometry twice, or am I missing something (perhaps vertex shader overhead, or perhaps the final image tiles having to be moved to DDR3, choking bandwidth)?

Just asking because MS promised the same for the 360 (tiled rendering overcoming the limitations of EDRAM), but to my knowledge that never really happened.

The problem is that a 1080p framebuffer is about 6MB, and with multi-pass rendering, if you do anything more than 5 passes (which is pretty common), it won't fit into the ESRAM, and the DDR3 is too slow. No amount of DX12 will increase the size of the ESRAM. You can have some workarounds, but they are at best workarounds, aka compromises.
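Spelling that arithmetic out (a minimal sketch that ignores buffer packing and compression tricks, so treat it as a rough bound):

```python
# How many full-resolution 1080p buffers fit in 32 MB of ESRAM?
ESRAM_MB = 32.0

def buffer_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 2**20

for bpp, label in [(3, "24-bit RGB (the ~6MB figure)"), (4, "32-bit RGBA")]:
    mb = buffer_mb(1920, 1080, bpp)
    print(f"{label}: {mb:.1f} MB each -> {int(ESRAM_MB // mb)} buffers fit")
```

At 3 bytes per pixel you get five buffers at most, and at 4 bytes per pixel only four, so any pipeline wanting more passes has to spill to somewhere slower.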
 

KORNdoggy

Member
I see it less as "lazy" and more as "less skilled".

It's all well and good saying "so-and-so developer is lazy because they couldn't match the production quality of Naughty Dog and The Last of Us on PS3", but it's pretty harsh. I doubt some top-tier animator is sitting there doing a half-assed job (hence "lazy"). It's more likely a half-assed animator is doing the best he can with the time he's got.

Studios and their abilities are not created equal. Just because Ryse looks great on the Xbone doesn't mean that DICE were lazy with BF4 for not reaching the same IQ or visual standards. It seems crazy to make such a statement, tbh.
 

Eusis

Member
I don't see how you can look at stuff like Dragon Age 2's copy-paste dungeons or FIFA 13-14 on Vita and refrain from calling them lazy.
I guess that's something of a grey area, but it's probably more easily rationalized as just being dumb: EA set too tight a timeline for a modern AAA RPG, and BioWare perhaps decided those resources should be spent elsewhere. So rather than pare down presentation in favor of a more diverse game world, they pared down design with cheap, easy copy-and-paste. There are so many parts in the machine and so many people making certain calls that I'd sooner expect some execs making unreasonable demands and the developers just doing what they can.

Also: interesting tech breakdown by nakedeyes. Guess that makes the situation more complicated than "it's just weaker, period", and it even makes me wonder whether 8 GB of GDDR5 to match the PS4 would in fact have made ports way closer, even with the same CPU/GPU.
I see it less as "lazy" and more as "less skilled". [...]
I don't think Ryse versus BF4 is the best comparison here; there's still the rushed-development factor, but ultimately it comes down to different design goals. BF4 wanted to be a large, open MP shooter that ran at 60 FPS, while Crytek's SP design, despite starting similarly, has more and more become the antithesis of this: willing to toss aside even a stable 30 FPS and remotely non-linear design for a hardware-pushing graphical showcase. Frostbite seems like it can look really, really damn good, and would perhaps stand toe to toe in a game with goals similar to Ryse's, but BF4 didn't have those goals, so it doesn't.
 

klaus

Member
The problem is that a 1080p framebuffer is about 6MB, and with multi-pass rendering, if you do anything more than 5 passes (which is pretty common), it won't fit into the ESRAM, and the DDR3 is too slow. No amount of DX12 will increase the size of the ESRAM. You can have some workarounds, but they are at best workarounds, aka compromises.

Yes, that's why I'm asking about tiled rendering - the technique splits the framebuffer into several tiles so they fit within the given amount of RAM; see here for the 360 implementation. But I'm not aware of any 360 games that used the technique, and I wonder if it might be done on the XB1.
 
Yes, that's why I'm asking about tiled rendering - the technique splits the framebuffer into several tiles so they fit within the given amount of RAM; see here for the 360 implementation. But I'm not aware of any 360 games that used the technique, and I wonder if it might be done on the XB1.

Many titles shipped with predicated tiling. It's not something any of us want to repeat on this generation.
 
I don't see how you see stuff like Dragon Age 2's copy-paste dungeons or FIFA 13-14 Vita and refraining yourself from calling them lazy.

Again, I'm not sure it was a case of "I could do this work, I just won't."

As someone who fucking hates Dragon Age 2, I am sure BioWare wasn't taking naps on the job. They probably worked with the time and budget they were given (which, you know, EA) and also made an unhealthy number of terrible decisions.

I doubt they figured it was easier to do it this way so they could get out of the office earlier and get a sweet vacation.
 
Didn't know it was used in several titles - can you elaborate on why it's a bad solution?

From the article you cited:
In predicated tiling, the commands issued in the Draw method are recorded before execution. The recorded commands, such as DrawPrimitives calls, are then executed for each tile, predicated based on whether the rendered primitives intersect the tile.

For a title with high draw throughput this is enormously expensive on the CPU/driver stack.
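A toy sketch of what "predicated" means there, using hypothetical structures (nothing like the real 360 API): the frame's draw calls are recorded once, then the whole list is replayed per tile, skipping draws whose bounds don't touch the current tile. That repeated per-tile replay is the CPU/driver cost being described.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x0: int
    y0: int
    x1: int
    y1: int

    def intersects(self, other: "Rect") -> bool:
        return not (self.x1 <= other.x0 or other.x1 <= self.x0 or
                    self.y1 <= other.y0 or other.y1 <= self.y0)

@dataclass
class DrawCall:
    name: str
    bounds: Rect  # screen-space bounding box of the recorded primitives

def render_tiled(draws, width, height, tile_height, execute):
    """Replay the recorded command list once for every tile."""
    for y in range(0, height, tile_height):
        tile = Rect(0, y, width, min(y + tile_height, height))
        for draw in draws:                    # every draw re-examined per tile
            if draw.bounds.intersects(tile):  # the "predicate"
                execute(draw, tile)

draws = [DrawCall("terrain", Rect(0, 0, 1920, 1080)),
         DrawCall("hud", Rect(0, 980, 1920, 1080))]
render_tiled(draws, 1920, 1080, tile_height=540,
             execute=lambda d, t: print(f"draw {d.name} into tile y={t.y0}"))
```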
 

klaus

Member
From the article you cited:


For a title with high draw throughput this is enormously expensive on the CPU/driver stack.

OK, so shouldn't DX12 at least alleviate the issue? At least they market it as improving CPU/driver overhead. Sorry for the OT.
 

nynt9

Member
OK, so shouldn't DX12 at least alleviate the issue? At least they market it as improving CPU/driver overhead. Sorry for the OT.

Possibly? I'm not certain, but remember, this is a compromise, so allocating CPU cycles to it might mean cutting corners elsewhere; iirc the PS4 also has a higher-flop CPU, so they'd need all of their CPU power to compete as well, in addition to the GPU power.

And again, this is a slower operation than just rendering, so it might affect framerates (I'd say that 1080p60 with anything other than extremely basic rendering will be very rare on the Xbox One).

We'll see what the future brings obviously, but this is my educated guess.
 
OK, so shouldn't DX12 at least alleviate the issue? At least they market it as improving CPU/driver overhead. Sorry for the OT.

I couldn't really answer this in sufficient detail without violating NDAs, sorry. Independent of that, I don't see predicated tiling being the optimization vector of choice moving forward. For some details on a technique that may prove more popular, I recommend:

http://www.killzone.com/en_US/blog/news/2014-03-06_regarding-killzone-shadow-fall-and-1080p.html
 

t_wilson01

Member
It's a sign of ignorance 99.9% of the time. There's no point in assuming things like laziness from the outside looking in.
Making assumptions while knowing very little about a subject is one of the things humans do best. Since the early '90s I've been reading about how game development is something you have to love doing because of how much time and effort has to be devoted to it. Sleeping at the office a few hours a day, not taking a shower for days... it's not easy work.
 

Platy

Member
Square is ALMOST making me believe in this



Bad cellphone ports, reusing the same tracks from the original in Curtain Call and padding out Bravely Default, not wanting to do HD towns, reusing the same FF character for at least 3 games...
 

LordOfChaos

Member
It always bugs me when people blame dev laziness for everything. More often than not, it's budget and time issues. Given infinite amounts of both, a lot of what people call laziness would go away.
 
If they were actually lazy, they'd be out of jobs. A lot of quality developers already get sacked for depressing reasons, so they're not going to half-ass something and hope no one notices.

As the PS4 continues to widen the gap and sell a higher percentage of multiplats, there will be less incentive to spend extra time on X1 versions, though. That doesn't mean developers won't try if they can. It's not as though anyone lets something out the door that is deliberately less than the best effort they could manage in the time they had.
 
It's only an argument for people who can't accept that their beloved company is not perfect.

Software development in this industry ranges from working long hours to not seeing your home at all during crunch time. People who can't get things done in time won't last in this industry, and everyone else is working their ass off on little sleep a lot of the time. Calling them lazy on top of that is insulting.

It's another thing entirely if the team doesn't have enough resources or time to do the job, or if they had to make sacrifices due to hardware limitations or whatever, but that is not their fault. You are expected to work within a certain budget based on ROI. That's not laziness, that's reality.
 

Tain

Member
This argument breaks down when devs say "the game can't run at 60fps!" and then a quick fix by a modder makes it run at 60fps. Unless the devs did something really bad like tying the game update rate to the framerate (never do this, btw; it causes weird issues when the framerate drops, an amateur mistake), in which case they are bad devs.

lol, sure, Nazca and Smilebit were bad and amateur devs for not developing around potential PC ports to be released a decade or more later
 

Eusis

Member
Square is ALMOST making me believe in this



Bad cellphone ports, reusing the same tracks from the original in Curtain Call and padding out Bravely Default, not wanting to do HD towns, reusing the same FF character for at least 3 games...
Meh, it'd probably be lazier (and, funnily enough, probably better design) for Bravely Default to just not pad things out. They may have had some gameplay-hour quota to meet and so worked to make sure that was hit rather than settling for short-yet-sweet. Padding things out DOES require effort after all, even if not as much as creating new content, especially when budget/time/space constraints rule out the alternative.
 
The question is, should we blame this ignorance on those who are spewing it, or does part of the blame lie with publishers who go out of their way to keep game development and its hardships and intricacies a hidden secret?

I don't think you would hear people making these dumb statements if the developers and bosses in charge of these titles offered some real transparency about how these products are made. I have a hard time blaming forum posters for not understanding something that is being purposefully withheld from the press.

I understand indies have tried to make things more transparent, but developer communication with actual customers is almost always polished nonsense that doesn't really go into the not-so-peachy areas of game production.

If publishers want consumers to understand arguments about how "you don't understand how much shit goes into making a game," maybe they should actually make a real attempt at building that kind of consumer knowledge.

I understand that it must suck as an individual developer reading anonymous people calling you and your team lazy, but people are human. They are emotional. They can only work with the information they have to draw these kinds of conclusions. I think people would appreciate these jobs more if there was actual consumer facing communication that presented that kind of information.

It's the kind of thing that a good union may actually negotiate for. The same with credits and other things. I think "lazy devs" is more than just a throwaway statement. It just demonstrates how limiting consumer knowledge is impacting the lives of developers.

It may make economic sense not to talk about how you are overworked in any official capacity, but that kind of transparency would make people appreciate games more. People might even begin to demand that games not be developed in a soul-destroying way if they actually knew that was how games were produced. It might even drive publishers to prioritize the happiness and health of their employees more, because it's actually a crucial aspect of a game's marketing.

Like a Fair Trade logo for developers...
 

Pimpwerx

Member
"Lazy devs" is a comment made by people who don't understand the complexity of software programming. It's quite an insult really. Even devs of bad ports work hard. PEACE.
 

nynt9

Member
lol, sure, Nazca and Smilebit were bad and amateur devs for not developing around potential PC ports to be released a decade or more later

Regardless of the platform you run on, when developing real-time applications it is a well-known basic rule that you do not tie your program logic to update timing - you scale every numeric value that gets updated over time by the time delta between updates. This is a fundamental principle, because crazy things can happen when, for whatever reason, your updates run faster or slower than the expected, supposedly locked rate. Not doing this is very bad practice, and even amateur game developers on the GameMaker forums know it. To imagine that a released, professional video game could be programmed without this principle is baffling.
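A minimal sketch of that rule in practice (illustrative values, no particular engine assumed):

```python
import time

SPEED = 120.0  # units per SECOND, never per frame

def update(position, dt):
    # Correct: scale by elapsed time. The classic mistake is
    # `position += SPEED` once per frame, which couples game
    # speed to however fast frames happen to arrive.
    return position + SPEED * dt

position = 0.0
last = time.perf_counter()
for _ in range(5):
    time.sleep(1 / 60)           # stand-in for rendering one frame
    now = time.perf_counter()
    position = update(position, now - last)
    last = now

print(f"after ~5 frames at 60fps: {position:.2f}")  # ~10 units at any framerate
```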
 

old

Member
Almost every time a PS3 port performed worse than the 360 version, people cried that it was due to lazy devs. We heard it for over half a decade.

Now you have a problem with it when the shoe is on the other foot.
 