
Rumor: Riskit4TheBiskit says four separate sources have indicated there is a Hellblade 2 port for PS5

reinking

Gold Member


I did not see a lot of people asking for it on PS5 prior to this rumor. This looks more like a case of the port being offered rather than demanded.
 
So it sounds like there was a significant amount of irresponsibility from multiple sources, and at different levels, in Western game development. You mentioned the differences between Sony, T2 (2K), and Ubisoft for lessons learned or not. Would you be willing to go into a little more detail as to how their situations and resolutions differ?
Sure.

Ubisoft is in a unique position in that, on their balance sheet, the thing bringing in 'lean' money is ongoing revenue streams - the few successful live services they do have up and running, like The Division 2 and R6: Siege. Legacy sales are also becoming an increasingly solid source of income for them, and AC still does well in this regard. Where Ubisoft has struggled is that a bunch of titles, both GAAS and SP, just aren't hitting the sales marks they had wanted. They are placing very heavy emphasis on this year's Assassin's Creed being a big tentpole for them as a publisher, but questions still loom over whether they can recover from the notable titles that came up short - whether it's that Avatar game that came out late last year, Skull & Bones underperforming, or long-gestating titles that never really came together (Beyond Good & Evil 2) - or whether those failures are just raising the bar of success for the next AC and Far Cry. Ubisoft is still very bullish on having most of their titles act as live services, as well as being heavily invested in Assassin's Creed, Tom Clancy (Division, R6, etc.), and Far Cry, with a new one dropping by the end of '25. Unlike the JP devs, Ubisoft is still very heavily invested in a handful of titles, instead of mitigating risk by putting out a bunch of smaller-budgeted titles with one or two expensive tentpoles meant to act as the financial lightning rods when they drop. Ubisoft still wants most of their titles to come out in a live-service framework, believing it to be a cheaper dev pipeline, when every modern financial analysis indicates it's a far more expensive development framework for a game, with far more risk. I still think some big media entity will come up with a bid for Ubisoft.
The IP is solid, the Tom Clancy name is still as big as ever, and they have the manpower to turn things around - they just lack the direction needed, as well as a real mobile operation, which is a big revenue stream they are missing out on.

T2 is different in that the titles they mainly rely on, GTA and NBA 2K, are just monster revenue earners. NBA 2K largely relies on effectively unregulated gambling mechanics, and as long as Western legislators are focused on other things, there isn't really a risk that can impact NBA 2K in the short term. As far as GTA is concerned, GTA Online/GTAV will keep bringing in revenue, and GTA6 is finally around the corner for them. Overall, we're talking about far smaller revenue by volume, but despite the outsized budget Rockstar gets, it is very lean, profitable revenue. On the flipside, T2 is struggling with a load of other projects and publishing deals. The Embracer fallout also affects them, since Gearbox is owned by Embracer, yet 2K holds the publishing rights to much of the Borderlands IP. Another Borderlands is also around the corner, and this one will, 2K hopes, prove more 'live-service'-y than prior games in the franchise. Other titles, SP titles in particular, have struggled under 2K at the moment, though. Whether it's the next BioShock being in dev hell for nearly a decade, Judas being several years late as well, Mafia both underperforming and hitting dev struggles, or their indie publishing arm running into financial struggles of its own, it's very easy to see that T2 is doing what Ubisoft, EA, and Sony all did - throw money at management issues and hope that it'll get them out of the fire.

Sony is a different matter. They pivoted hard into GAAS titles, but it wasn't until it came time to talk launch strategy for some of them that the feasibility of this direction finally got called into question. Just in the last two years we're talking about four or five of these titles being publicly cancelled in some fashion, which is unusual for a publisher normally so guarded about their development practices. The lesson that they - and, more slowly, the larger industry - are learning is that focusing on GAAS to showcase revenue growth for your investors is not going to work. I know there is some debate on whether or not live-service is too saturated, and we could discuss endlessly the collapse of one of the bigger titles in the space (Destiny) or the rise of significant new live-service titles like Helldivers 2. But folks shouldn't treat Helldivers 2 as a shining example of live-service going right so much as a case of a small indie team that operated on very lean budgets for years, had a very set vision of how a small-scale live-service title could work - thanks in large part to having done it with the first game - and put all that knowledge into practice for the sequel. It can't be stressed enough just how much money it costs to run these games, outside of an Arrowhead or Hello Games situation. The revenue sounds great on paper, but when you look at the astronomical operating costs necessary to keep these things going, it becomes clear that some of these projects quickly enter territory where their success becomes a massive albatross, not allowing them to even begin thinking about pivoting without threatening that revenue stream and thus the survival of the studio itself.

To be fair to Sony, as the Insomniac leak showed, they were already having frank internal discussions about cost-cutting measures and these insane budgets, as well as the fiscal reality of what it would mean for their teams to put out live-service titles. They still have two or three live-service titles in the pipe for the next year, and for the folks who doubted their ability to launch GAAS on console/PC, I'd argue that Fall Guys and Rocket League, both on PS4, showed both Sony and the larger industry that Sony's playerbase is completely willing to get behind live-service/MP titles as long as the games are great and the barrier to entry is reasonable. It does sound like a bigger emphasis is now being placed on the teams to have more SP titles in production, but at smaller budget levels, and we are going to be a minimum of two years out from seeing any fruit from that.

The big story out of the last two years of game releases is how strongly SP titles can and do sell in the core games market. Yes, they don't bring in as much money as a Fortnite or a Destiny, and they don't get the engagement of a CoD, but as long as you are budgeting responsibly, they can make incredible profit. Remedy is claiming that Alan Wake 2 was a sales success for them, and I totally believe them, going by how lean their development budgets can be. The other truth is that most publishers chasing these F2P GAAS trends have only gotten burned these last two years. We saw Ubisoft get burned by this repeatedly in the last three years. WB is struggling with Suicide Squad and MultiVersus, and who knows how those will shape up. The Finals was seemingly a success in December, then Palworld and HD2 practically erased it from the zeitgeist. And the big successes - CoD, Fortnite, Destiny - are all struggling in significant ways, despite CoD and Fortnite still having solid engagement; they just can't keep up with the costs of making those things.
 
Man, I've never thought about that, and you're 100% right. The crazy thing is no one is really talking about this. One thing, though: a lot of these Japanese devs aren't trying to push the craziest graphics they can, and I think that's probably a huge reason for the cost savings. I have to imagine not hiring outside DEI consulting firms also saves them some money - I'm assuming these Japanese devs aren't hiring companies like that.
JP devs certainly do get some DEI consultation, but it's a drop in the bucket. The larger thing you hit on is that they are not chasing graphics, but I feel like that framing is too shallow - the issue isn't just that chasing graphics is expensive, although it is. Take MS and XGS, for example: outside of Hellblade 2, Microsoft has largely abandoned these major graphical showcases, partially because there's no avoiding the XSS as your lowest common denominator, but also because most of the games they are greenlighting are being made, especially now, within the budget constraints of Game Pass.

What I would argue is not that this is a graphics thing, but that when we make games with excellent graphics, we also tend to want cutscenes to show those graphics off, and on the publisher side, they absolutely LOVE cutscenes, since they are so marketable. This is a trap Sony in particular fell for - increasing the budget for cinematics on most of their titles to such an absurd degree that it irresponsibly ballooned the budgets of some of the sequels, like Ragnarok or Spider-Man 2, which would've had far more reasonable budgets if their cutscene budgets had been similar to their respective predecessors'. The thing with Eastern games is that, since they don't have all of this excess money to spend, cutscene budgets have to be used very carefully. Take Tears of the Kingdom - one of the highest-selling games of last year, the best-selling Zelda game ever, one of the largest videogames ever produced, probably right around the $100m budget mark - and the game has almost no voice acting and something like 15-16 cutscenes in total. Ragnarok, on the other hand, has over double the number of cinematics that GoW 2018 had. Was that truly necessary? Did you need to play as Atreus and see all those cutscenes? Did it make the game sell that many more copies? I'd argue no, tbh. And it sounds like some publishing folks are beginning to agree.
 

T-0800

Member
Expensive, elaborate cutscenes are an incredible waste of money. I love The Last of Us, and yes, the cutscenes are great, but it is the gameplay that makes or breaks a game. If The Last of Us 3 had still pictures for the majority of its cutscenes, I would be totally fine with that.
 

fallingdove

Member
Some weird fucking takes in this thread.

For years we have heard that development budgets were increasing faster than profits were. So what do platforms/developers do in order to meet the expectations of fans and make enough money to keep the lights on?

A) Raise the price of their games
B) Release their titles on more platforms (PC/Previous Gen/Competing Platforms)
C) Lean into GAAS
D) Introduce subscriptions

So when these companies try to make money, we all complain: about a $10 increase in the price of games, that the game isn't next-gen enough because it's on the previous gen, that the game is appearing on a competing platform, that the game has optional microtransactions, and that the platform's game subscriptions aren't offering enough value.

I'm not a fan of Hellblade and have no interest in playing another one of their games, but shit - the game couldn't have been cheap to make, and because it lacks mass-market appeal, Xbox is evaluating alternative revenue options in order to continue developing these titles. What do people seriously expect Xbox to do?

This whole narrative that Sony and Microsoft have failed with these consoles is kind of ridiculous. While Xbox hasn't delivered 360-level success, they aren't doing much worse than the Xbox One. And Sony? PS1 and PS2 had very few 1st-party titles worth remembering, while this gen I've had plenty of 1st-party, console-exclusive, and multiplatform bangers to play through. I don't know why anyone would be dissatisfied with what they've had access to at this point.

I mean - as a kid I wouldn't have known what to do with access to Forza Horizon 5, Horizon Forbidden West, Returnal, Tears of the Kingdom, Elden Ring, Metroid Dread, RE2 Remake, Xenoblade Chronicles 2, Final Fantasy XVI, Halo Infinite, Persona 3 Remake, Eiyuden, SF6, Tekken 8, MK1, Demon's Souls Remake, Final Fantasy VII Rebirth, Granblue, Dragon's Dogma 2, Diablo 4, No Rest for the Wicked… The polish, the scale, and the complexity of these games dwarf anything I had access to in the first 15 years of my gaming experience.
 

Varteras

Gold Member

Perhaps you can correct me on this or expand upon it, but it seems to me that the companies with the best shot at making GAAS work long-term are the ones who do it like Arrowhead, not the ones who do it like Bungie. What I mean is that games like Destiny seem just incredibly difficult to sustain because they pretty much require constant full-time production, where every department is churning out significant work for the next big content drop. Every year, Destiny is expected to have a major expansion with multiple seasonal drops: new storylines, cinematics, voice-overs, tons of weapons, new subclasses, new elaborate maps, new features or modes, big changes to existing content, raids, and multiple strikes and dungeons. They're practically creating a whole new game every year.

Helldivers, on the other hand, seems to be taking a much more sustainable approach of trickling in new content that slowly builds on top of the base game. They could easily delay something behind the scenes and no one would be much the wiser, whereas if a Destiny expansion gets delayed, it's a seismic event that throws everything off. The expectations for these two games are entirely different. There is also a discussion to be had about their base game design: Destiny requires you to constantly and deeply invest yourself in the game to get anywhere with it, while Helldivers is much more, excuse the expression, "hit it and quit it". You can just do a series of one-night stands with it, but Destiny wants a relationship.

Is this possibly the issue Naughty Dog saw coming with their game? That their plan to make such a story-focused GAAS title meant bogging themselves down the same way Bungie did? I mean, they said themselves they saw an issue with the constant need for large-scale production, but I do wonder if that issue stemmed from taking an approach similar to Bungie's. Perhaps they would have been better served taking the Arrowhead approach, though that might not have suited them, considering Naughty Dog is clearly a story-first studio. Of course, none of this is to say that GAAS titles like Destiny can't work long-term. After all, Destiny was around for 10 years before it found itself at death's door.

But the line of thinking that Arrowhead is obviously doing it right by being more pick-up-and-play might not be accurate either, considering the struggle you pointed out Fortnite is experiencing - though maybe that game's issue is unrelated? In your opinion, what does a company need to do to succeed at GAAS? Is there so much luck involved that even doing things a certain way is just too much of a crapshoot for most big teams to risk?


I'm glad you brought up cutscenes, because this is something my friends and I were talking about last night while we played Helldivers: the fact that the game has so many moments that COULD be cinematic, but they happen while you play. You can grab clips and tell a story through your own gameplay, as plenty of people already have on YouTube or TikTok. You'll find datapads or hear transmissions that flesh out the world around you while also leaving room for your imagination. And you never question that you're playing a game.

This is similar to what Neil Druckmann praised Elden Ring for: the world told the story. Sure, there were moments of cinematic exposition, but those were brief hype moments, meant to pump you up for what's coming. As you said, there are cutscenes in these games - more than a few - that just did not need to happen or should have been smaller in scale. You can have tense conversations in-game without showing us how much the characters are sweating or whether they plucked their eyebrows recently. I don't need to be made aware of every facial expression in a conversation.

There is an irresponsibility in some Western game development that needs to be corrected. If you want Atreus to have a discussion with Thor, fine - slow the game down so that I'm still in control, but can't do much, while they speak. That doesn't require an elaborate cutscene with panning shots, emphasis on whoever is talking, and every mundane movement on display.

Spider-Man 2 is a great game, but some of its cutscenes were unnecessary to the story, while others lasted longer than they needed to. At the end of one cutscene, I didn't need a shot of Miles' covered face telling a disembodied voice that Peter isn't himself - that could have easily been saved for a conversation while I'm swinging around the city. Did I really need a cutscene of Peter hanging upside down next to an enemy truck, listening to a radio transmission, just to say "I'll never make it in time"? No. You could have had me walk up to the truck myself and listen to the radio while I stand there. But for some reason, Insomniac had to be extra and remind us, via a cutscene, that Peter can hang upside down, even with no good reason in that moment.

These, to me, are such obvious examples of how Western devs have gone increasingly overboard, with several of Sony's studios being quite guilty - probably the largest offenders. I certainly hope that Sony and other major publishers crack down on such obvious wastes of time and money. It's one of the few things I think Microsoft's studios have done very right. I don't think there is anything wrong with a Hellblade-style game that's very cinematic and a graphical showcase, but if that's what many of your studios are making, you have a problem, and I don't think chalking it up to "we're storytellers" is a good enough excuse. Do you see that trend being brought under control? Are budgets at a point where there is just no choice?
 
So, the issues plaguing GaaS are multi-fold, but it's important to understand that the focus on 'infinite growth' economics is the big driver of why so many folks in publishing chase GaaS to begin with. What Helldivers 2 primarily did right wasn't taking the most profitable ideas or business direction from the current successes in the GaaS field, but creating a GaaS monetization model specifically for their game. What afflicts so many GaaS games nowadays (Diablo 4 is a prime example) is that they are built around a systemic loop designed - the finance types hope - to churn out exponentially increasing revenue. Arrowhead, instead, focused on the core sandbox and gameplay loop of their game and then tailored their monetization in a direction that was sustainable for their own operating costs.

What stopped both Sony and Naughty Dog from pulling the trigger was ultimately that these games become a black hole for a company's development teams, and that isn't what either Sony or Naughty Dog wanted. Ultimately, these games become victims of their own success. The operating costs and ever-increasing development budgets that go into just keeping the revenue flowing for one of the bigger GaaS titles like Fortnite or Destiny basically make the all-in commitment a self-fulfilling prophecy. For everyone involved, eating those ever-increasing ongoing costs is just not sustainable, since most fiscally responsible folks know that infinite growth is impossible, even if they won't admit it. You will ultimately have to pivot your title to attract new consumers (what Fortnite did by launching smaller games inside it over the last year), or your costs, development budgets, and timelines create an environment where you can't make new stuff fast enough, your audience simply grows tired and leaves, and you still have all those costs to burn (Destiny is currently here).

And this isn't even speaking to how unbelievably difficult maintaining user engagement is in modern GaaS. The market is so saturated that you can literally watch communities of users who enjoy these games rotating between a handful of them; the genre isn't necessarily bringing in new users. And the bigger folks operating and controlling the GaaS market can pour just as much into their games, and their dev teams are so astronomically large that pivoting is easy, so it's almost impossible for a new game to come out and not get ripped off by the big dogs within months, if not weeks.

These, to me, are such obvious examples of how Western devs have gone increasingly overboard, with several of Sony's studios being quite guilty. Probably the largest offenders. I certainly hope that Sony and other major publishers crack down on such obvious wastes of time and money. It's one of the few things that I think Microsoft's studios have done very right. I don't think there is anything wrong with a Hellblade-style game that's very cinematic and a graphical showcase, but if that's what many of your studios are making, you have a problem. I don't think chalking it up as, "We're storytellers," is a good enough excuse. Do you see that trend being brought under control? Are budgets at a point where there is just no choice?

There's always a choice. One of the things you have to understand about budgets and how investment operates is that everyone just assumes funding increases generate growth increases. When SM1 was funded at $100m, that was seen as pretty high and likely very risky for a licensed IP, even if it is Spider-Man. However, it sold a tremendous amount and directly spurred console sales for the PS4. Publishers are a lot of things, but the number one thing to never lose sight of is that a publisher is ultimately a financial investor who owns the team or product they are investing in.

So for a group like Sony, when you see these higher-budgeted titles selling super well and spurring growth for console sales and SW sales in your ecosystem, you basically come away confident that increasing those investment levels should yield a similar bump in growth, right? Well, that's how investors of all shapes and sizes operate. If Spider-Man 2 had sold 3x the amount of SM1, both on HW and SW, we would probably be having a very different conversation. Sure, the budget would still be quite high, but as an investor you'd say your bet paid off.

If you're a dev and you get handed this much money to make a game, a part of you simply isn't gonna question it. Hey, it's free money! If you've ever worked in an office or on a software team, you know that basically every department is always resource-strapped. You get that much money to make your next product, and suddenly every department can do the hiring they all claim they needed. Suddenly, all those ideas for cutscenes and voiced lines you wanted to do before become feasible. See, another thing publishers/investors don't like to see is money NOT being spent; it's considered a troubling sign. If I give you 3x the money you needed before, but you really only needed an additional 50% of your original budget, and I see you not spending $150m of it, I'm gonna ask why, because now I assume we won't hit the growth targets we forecasted for this title. So a drastic increase in budget kinda goes hand in hand with encouraging wasteful spending practices. After all, investment funding is meant to be used on creating the products you promised to create with it. You can't just keep that money as cash on hand - that's what the studio's split of the revenue the product ultimately generates is for.
 

Killer8

Member
I look forward to playing the definitive version later with PS5 Pro support, DualSense haptics and the Pulse headset letting me hear the voices in Senua's head.
 

Varteras

Gold Member

It sounds like an accurate summary to say that the industry just has way too many people at high levels, holding the purse strings, who badly lack market understanding. Essentially gambling with huge sums of money on the pipe dream of non-stop growth, which you would think anyone with a sliver of financial sense would pump the brakes on. Are there any indications of course correction among these companies that are obviously barreling towards a nasty reckoning? I know you mentioned Sony woke up much sooner than most, but I suppose my concern now is a big market crash. I mean, it seems we're already in one. But I wonder what that looks like on the other side if more companies don't wise up soon enough.

On top of that, what is the general feeling in the industry towards DEI? As always, many on the outside would be quick to point a finger at diversity hires as an issue plaguing game quality and bogging down development with, presumably, low-talent and problematic hires. It's an easy bogeyman to go after, but from your position, is it actually an issue, or is that an overblown situation based more on feelings than facts? There are very obvious examples of how people who would typically be lumped into such a category have shown themselves to be of quite poor character, but that's not necessarily indicative of the larger portion of them. I understand if you'd rather not touch this subject, as it is prone to heated conversation and possibly political posts, which are against the rules. I am curious, nonetheless.
 
Did you need to play as Atreus and see all those cutscenes? Did it make the game sell that many more copies? I'd argue no, tbh. And it sounds like some publishing folks are beginning to agree.

I won't speak for everyone else, but hell no, they didn't, and they weren't necessary at all. I didn't realize cutscenes were so expensive though. I always imagined that was the cheaper side of gaming. Shows what I know.
 
Are there any indications of course correction among these companies that are obviously barreling towards a nasty reckoning? I know you mentioned Sony woke up much sooner than most, but I suppose my concern now is a big market crash. I mean, it seems we're already in one. But I wonder what that looks like on the other side if more companies don't wise up soon enough.
I mean, we are effectively living through a market crash at the moment. It's not necessarily consumer-facing, given that games are simply too profitable for the industry to ever go through a crash similar to the '80s, but this is what a modern crash is gonna look like. We are living in a reality where storied devs can't get their titles out the door and lack the direction or capital needed to do so at the highest level.

What you're gonna wind up seeing is big publishing names pivot into areas where they are currently underrepresented, at least as far as 'core' games publishers are concerned. WB and T2 focusing on mobile will be a thing, and you're gonna see Ubisoft make this pivot as well; for the most part, they have a ton of growth opportunity in that sector. There is also going to be a significant number of 'new' names we are gonna familiarize ourselves with as an industry over the next decade. Lots of new teams are getting up and running, still securing funding, with a ton of talent and experience from the bigger houses. Lots of refugees from consolidated studios have started their own endeavors.

On top of that, what is the general feeling in the industry towards DEI? As always, many on the outside would be quick to point a finger at diversity hires as an issue plaguing game quality and bogging down development with, presumably, low-talent and problematic hires. It's an easy bogeyman to go after, but from your position, is it actually an issue, or is that an overblown situation based more on feelings than facts? There are very obvious examples of how people who would typically be lumped into such a category have shown themselves to be of quite poor character, but that's not necessarily indicative of the larger portion of them. I understand if you'd rather not touch this subject, as it is prone to heated conversation and possibly political posts, which are against the rules. I am curious, nonetheless.
I wouldn't necessarily agree with the notion, nor have I seen anyone in the industry posit it either, that DEI has somehow been a net negative overall for us. Obviously, there can be problems whenever you do hiring, and a lot of the corporate types have baked certain DEI KPI metrics into their bonus structures, which is why there was such a concerted push for it, but hiring in this industry has largely been driven more by word of mouth/nepotism than by pure 'talent' or whether someone is a DEI hire. I saw a person who legitimately couldn't implement *anything* into any game they worked on fail upwards into a design-director-level position at a very prestigious studio, for example, and they are not an anomaly, and no - not a DEI hire whatsoever. That person did more damage to the games they were a part of than any DEI hire I have ever seen.

Certainly, in recent years, you can say that certain folks entered the industry with the sole intent of 'shaking things up', and for the most part, there are lots of things about the games industry that should be shaken up. But I feel like the entire discussion around DEI and the games industry gets away from the true issue the industry is facing, and that is specifically talent retention. To go back to the example we were speaking of earlier: in JP game studios, the experience age of developers is significantly higher than the experience age of devs at studios here in the Western games industry. When I started over a decade ago, I was point-blank told by folks I considered mentors and vets that you could consider yourself an industry veteran if you'd worked in it for 3 years or more, and that was honestly underselling it.

The Western games industry's ability to retain talent is just unbelievably shit. There are so many vectors to this story that should be thoroughly discussed; in fact, this is a topic that games journalists should be openly talking about all the time. Talent retention issues are some of the biggest factors contributing to the quality slump found in Western games right now. Take a big conglomerate like MS: their contractor policy directly harms every single one of the projects they are funding. Developers having experience with an engine, dev workflows, pipelines, and even the coworkers themselves all go a long way towards making a team significantly bigger than the sum of its parts. When you have a policy that says contractors can't work on a project longer than 12/18 months, and your game takes 3/4/5 years to make, that just means you have developers gaining all this talent and experience and then walking out the door right when they've become most valuable. In my opinion, most game devs are gonna take 3-6 months to get acclimated to their new team and workflows, especially coming in as a software engineer, not to mention getting trained to contribute to your project if it's something they haven't ever worked on before - say, taking a person who's only ever worked on puzzle games and getting them to work on Call of Duty.

Asian studios don't deal with this, because work culture in those regions is so significantly different than it is in the West. There are entire engineering teams at Capcom that have been there for over 20 years at this point. If you were to ask me to find a similar example at a Western studio, it's basically impossible to find a single person actually contributing towards game creation who has been at said studio/team that long, much less an entire team. What's worse is just how bad the batch of new hires and jr. devs entering not just games but all of tech is now. Over the last 4 years, I've been fortunate enough to have conducted over 250 interviews and have had to hire over 40 folks from that pool, and the state of experience and capability of the young folks just coming out of school, or even 2-3 years into the field, is just absolutely terrible. I know that many schools in the country lowered the requirements for their STEM programs to increase graduation and job placement rates, since the field has never paid out as much as it currently does, but we are definitely seeing the shit end of this from folks entering the field. I've had to hire maybe 4-6 new engineers who were fresh from school or a year in, and their experience level was absolutely non-existent. I used to think this was just anecdotal, until I read some studies and spoke to other folks in similar positions where this assertion also came up.
 

Varteras

Gold Member

So on the development side of things, we have too many studios flooded with low-skilled people, either because (A) they are rookies who had academic or experience entry barriers lowered for them, (B) they are friends or relatives of the right people who probably should never have been given a job, never mind a promotion, or (C) the ones who gained that skillset left for various reasons and none of them came back. Very little of which has to do with DEI.

I'm sure this is industry-wide, but are there certain groups where this is more prevalent? Less prevalent? Microsoft was one example of having that issue because of its contract labor rules. Does the work speak for itself? As in, is it pretty clear who is doing it right and wrong? Also, is it pretty safe to say that when you reference Western groups, Sony is included? Seeing as, while they may be Japanese, the vast bulk of their development muscle is Western. In addition, do you think they would do well to try to add more Eastern devs to their ranks, or is that difficult based on the very thing you just said about company loyalty? Has there been any indication that Microsoft will relax its contract labor policy? Or is that a company-wide policy meant for business elements more important to them than gaming?

Hopping back over to mobile gaming, what is the future there? People often debate this, but are we heading towards a future where the hardware doesn't matter? Such as consoles being gone, or being so little of the overall market that it's more important to have strong software offerings everywhere. Seeing Death Stranding go to mobile really made me take notice of how things could be shifting.
 
So on the development side of things, we have too many studios flooded with low-skilled people, either because (A) they are rookies who had academic or experience entry barriers lowered for them, (B) they are friends or relatives of the right people who probably should never have been given a job, never mind a promotion, or (C) the ones who gained that skillset left for various reasons and none of them came back. Very little of which has to do with DEI.

I'm sure this is industry-wide, but are there certain groups where this is more prevalent? Less prevalent? Microsoft was one example of having that issue because of its contract labor rules. Does the work speak for itself? As in, is it pretty clear who is doing it right and wrong? Also, is it pretty safe to say that when you reference Western groups, Sony is included? Seeing as, while they may be Japanese, the vast bulk of their development muscle is Western. In addition, do you think they would do well to try to add more Eastern devs to their ranks, or is that difficult based on the very thing you just said about company loyalty? Has there been any indication that Microsoft will relax its contract labor policy? Or is that a company-wide policy meant for business elements more important to them than gaming?
I do include Sony in that, but they are sort of a middle ground, in that loads of folks at Sony studios - and yes, I am referring to the Western ones here primarily - have a far longer tenure than you'd see at most other Western game dev groups. Of course, they are also vulnerable to the issues that make game industry workers more likely to leave, such as poor work/life balance and lower wages compared to other Western tech jobs, as immediate examples. It does sound like there has been a bit of a cultural back-and-forth battle within Sony, with the pendulum swinging back towards how their Eastern counterparts do business, but it's early days and we have years to go before we see any results from that. Sony largely operates around profit and not growth (recent times notwithstanding), so it'll be nice to see what they do with this hopeful return to form.

Microsoft has absolutely no intention of relaxing their contractor policy. The games still generate ridiculous amounts of revenue, they get significant government kickbacks for their contractor policy, and we simply have no idea what MS' overhead would look like if they were to bring that talent in-house, where it belongs. The contractor policy also makes it far harder to get people to come work at MS, since going in, you know you have a hard deadline looming. Imagine taking a job knowing you're gonna get laid off in 12/18 months and there's nothing you can do about it.

Hopping back over to mobile gaming, what is the future there? People often debate this, but are we heading towards a future where the hardware doesn't matter? Such as consoles being gone, or being so little of the overall market that it's more important to have strong software offerings everywhere. Seeing Death Stranding go to mobile really made me take notice of how things could be shifting.
Hardware will absolutely matter, and for the foreseeable future. One of the things climate change will affect is our communications networks, so any hope that we can simply iterate around the bandwidth issues that stop the Cloud from taking off is purely a pipe dream at this point. Not just that, but there's also a massive consumer psychology aspect to this: consumers, now more than ever, despite the absolute prevalence of and potential cost savings from the current Cloud gaming solutions on offer, are soundly rejecting them at every turn. The biggest investor in this, Microsoft, is even backing off of it. Consumers are now far more willing to go ahead and buy electronic devices that natively render software, whether that's PC gaming being at its absolute height in popularity and growing, the emergence of natively rendered mid-level PC mobile gaming (Steam Deck), or just how big/popular mobile gaming is as a whole.

This isn't really a talk about the Cloud, though - lots of folks just conflate the two. The big innovation we are seeing play out right now is how successful chip makers are getting at taking more sophisticated tech and making it smaller and smaller. Seeing some 'core' games now heading towards mobile devices, with even more coming in the not-too-distant future, is going to blur these lines even further. The hardware will always matter, but as the industry increasingly hits the brakes on chasing graphics at the expense of budget, we are gonna see the mobile sector make APUs that are feature-rich enough to run new-ish games on modest settings.

There is a caveat to this, though: everything I've discussed to this point is still plainly about 'core' games. When we talk about mobile games, you need to keep in mind that mobile gaming experiences are very, very different affairs from 'core' gaming experiences, and while the audience that plays core games might play the other, research suggests this is not a two-way street; there is a proverbial truckload of mobile gamers who do not, and likely won't ever, play 'core' games. That's perfectly okay, though - in the words of Arrowhead: "A game for everyone is a game for no one." However, mobile games are still very important to the health of 'core' games. There is a ton of research suggesting that Nintendo and mobile games are still vastly responsible for gamers ultimately getting into 'core' games. Yes, the audience for core games pales compared to the sizable audience for mobile games, but core gamers spend far, far more than mobile gamers do on average, so it's just a numbers game regarding business plans.

I believe we are going to see a future where games are still made tailored for the devices they are meant for (mobile games targeting Android/iPhone primarily, core games targeting PC/console) instead of folks making one game with a variety of I/O configs. I think one of the growth areas you will see publishers chase is going to be getting some of their back catalog running natively on mobile devices, similar to Death Stranding. At first, this is gonna be seen as further monetizing a publisher's or developer's back catalog - just a means to generate income on games that have long stopped generating revenue on the devices they are currently offered on. The real test will be when a big AAA game launches both on mobile and on core platforms, day 1. I've yet to hear of the game that will do this, but I know this day is coming, with a few attempts I'm sure, and the results will begin setting the course of the future of the industry.
 

nial

Gold Member
I do include Sony in that, but they are sort of this middle-ground in that loads of folks at Sony studios, and yes I am referring to the Western ones here primarily, have a far longer tenure than you'd see with most other Western game dev groups.
Funnily enough, their Japanese departments always went hard on contractors. It's pretty much why a sequel to, say, Ape Escape could have a completely different development team from the first game.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
So on the development side of things, we have too many studios flooded with low-skilled people: either because (A) they are rookies who had academic or experience entry barriers lowered; (B) they are friends or relatives of the right people who probably should never have been given a job, never mind a promotion; or (C) the ones who had that skillset left for various reasons and none of them came back. Very little of which has to do with DEI.

I'm sure this is industry-wide, but are there certain groups where this is more prevalent? Less prevalent? Microsoft was one example of having that issue because of its contract labor rules. Does the work speak for itself? As in, is it pretty clear who is doing it right and wrong? Also, is it safe to say that when you reference Western groups, Sony is included, seeing as the company may be Japanese but the vast bulk of its development muscle is Western? In addition, do you think they would do well to try to add more Eastern devs to their ranks, or is that difficult based on the very thing you just said about company loyalty? Has there been any indication that Microsoft will relax its contract labor rules? Or is that a company-wide policy meant for business elements more important to them than gaming?

Hopping back over to mobile gaming, what is the future there? People often debate this, but are we heading toward a future where the hardware doesn't matter? That is, consoles being gone, or such a small piece of the overall market, that it's more important to have strong software offerings everywhere? Seeing Death Stranding go to mobile really made me take notice of how things could be shifting.

Very good questions and answers on DEI and what it is and isn't doing to the industry. It feels like it will be 15 years until mobile phones can output graphics and physics good enough that you wouldn't need a console to play a core game very well. I'm talking PlayStation 8 in the year 2040.

At some point, mobile phones will be able to render games at 1080p but use AI upscaling to output them at 4K, with full path-traced ray tracing, 60 frames per second, and a high level of physics. Once we get there, I don't know where else we can go.
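For scale, the arithmetic behind that "render at 1080p, output at 4K, at 60 fps" scenario is easy to sketch. This is just pixel counting and frame-budget math, with no claim about how any particular AI upscaler works:

```python
# Pixel-count arithmetic for the 1080p-render / 4K-output scenario.
W1080, H1080 = 1920, 1080   # internal render resolution
W4K, H4K = 3840, 2160       # output resolution after upscaling

native_px = W1080 * H1080   # pixels the GPU actually shades per frame
output_px = W4K * H4K       # pixels displayed after AI upscaling
ratio = output_px / native_px
print(ratio)                # 4.0 -> the upscaler fills 3 of every 4 output pixels

fps = 60
frame_budget_ms = 1000 / fps
print(round(frame_budget_ms, 2))  # 16.67 ms per frame for rendering + upscaling combined
```

In other words, rendering internally at 1080p cuts the shaded pixel count to a quarter of native 4K, which is exactly the headroom a phone would need to spend on path tracing and physics inside that ~16.7 ms frame budget.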
 