
Revisiting: Jonathan Blow devtalk "we don't expect software to work anymore"

EviLore

Expansive Ellipses
Staff Member

This is a talk from 2019, but highly relevant today. Jonathan Blow argues that we don't expect software to work anymore, and that this may precipitate the fall of civilization as we become increasingly dependent on software for mission critical operations in our day to day lives, from communication to transportation. The first example he cites is at Boeing, where pure software problems started to crash planes. Increasingly buggy, poorly optimized code is masked by the consistent strides made in hardware computational power. Products are shipped first, half-broken much of the time, and patched later. Standards have fallen precipitously, and this has become the new normal.

Give the full version or the highlight version a watch and let me know your thoughts.
 
Everything this guy has said has come to pass, from workforce culture and hiring agendas to software becoming buggier as complexity rises. It's sad to see him become a pariah in the gaming community for reporting facts that conflict with the prevailing feelings and narratives of the people working in, or participating in, the communities around those industries.

Braid was rad, too. Wish he'd return to game dev.
 
Last edited:

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
It's only gonna get worse with every idiot coming out of college trying to use chatgpt in their jobs.

This cannot be overstated. ChatGPT is a ridiculously useful tool in programming, but it's going to make people severely lazy. I'm already forcing myself not to just feed problems into it and copy-paste the results as a default method of coding. Now, I'm not fresh out of Uni; I understand what the code is doing and where it is and isn't going to work, most of the time. But it probably shouldn't be the first thing I do when I'm approaching a problem.
 

Ballthyrm

Member

This is a talk from 2019, but highly relevant today. Jonathan Blow argues that we don't expect software to work anymore, and that this may precipitate the fall of civilization as we become increasingly dependent on software for mission critical operations in our day to day lives, from communication to transportation. The first example he cites is at Boeing, where pure software problems started to crash planes. Increasingly buggy, poorly optimized code is masked by the consistent strides made in hardware computational power. Products are shipped first, half-broken much of the time, and patched later. Standards have fallen precipitously, and this has become the new normal.

Give the full version or the highlight version a watch and let me know your thoughts.


Won't stay that way for much longer. With AI we will probably get an "optimise this" button pretty soon.
Either that or we hit the hardware limit and we'll have to optimise to keep the progress going.

Right now hardware progress has made people lazy, but it won't stay that way indefinitely.
 

StueyDuck

Member
This cannot be overstated. ChatGPT is a ridiculously useful tool in programming, but it's going to make people severely lazy. I'm already forcing myself not to just feed problems into it and copy-paste the results as a default method of coding. Now, I'm not fresh out of Uni; I understand what the code is doing and where it is and isn't going to work, most of the time. But it probably shouldn't be the first thing I do when I'm approaching a problem.
Believe me, I'm not a genius developer either, but I have 10+ years in the industry now and was still educated in, let's say, "the old ways".

With the industry shifting to constantly pushing out releases and outsourcing to places where people really are just poorly trained to code, it becomes easy to see how messy spaghetti code gets shipped so consistently these days.

AI use should really be managed by the project leads. I worry about how much worse applications are going to get with copy/pasted code and little understanding of the code behind it.
 
Last edited:

Fuz

Banned
Lol, you clearly never worked as a developer in a company. Most of the issues we see today in gaming are due to the higher-ups deciding to release broken shit with a ”promise” that it will get fixed through patches.
I don't think the focus of this discussion is games.
 
Admission standards in a lot of once renowned schools have selectively plummeted as well.
yep, the "let's get rid of grades", and push 'DEI' agendas at all costs instead of excellence, will make us go backwards as a species. There are some countries not doing this though (cough Russia, China... .and they will probably advance above the west in the years to come if this continues.

Then there is ai and other such things that will further the current issues.
 
Last edited:
Ehhh, more like modern education and modern workflows like Agile are creating "good enough" programmers, as in "it doesn't crash too badly so it's ready to release", rather than skilled ones.

It's only gonna get worse with every idiot coming out of college trying to use chatgpt in their jobs.
True.
I write code for a living and in my experience, the biggest culprit here is management not having a clue what quality in software dev actually means. They love to cut corners everywhere despite being warned by the programmers all the time. Doesn't matter if it's security related, data related, UI related or whatever. Half bad is good enough. Everyone wants their app, but none are willing to pay the price for it.
 

TVexperto

Member
Everything this guy has said has come to pass, from workforce culture and hiring agendas to software becoming buggier as complexity rises. It's sad to see him become a pariah in the gaming community for reporting facts that conflict with the prevailing feelings and narratives of the people working in, or participating in, the communities around those industries.

Braid was rad, too. Wish he'd return to game dev.
what did he say about workforce culture/hiring agendas? interested in hearing more about this
 

cormack12

Gold Member
I have found deteriorating code in our suppliers' products as of late. Not even hard stuff either.

I've noticed a shift to testing what we want the app to do and not testing where it could break.

An example is some software where, if you are part of a team, clicking into a menu takes you into a team subscription. They never thought about what would happen if the user wasn't part of a team (the default account state), so it throws hundreds of API errors. The workaround? Add them to a 'fake team' as part of the account setup process until they patch it.
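To make that concrete, here's a minimal, self-contained sketch of the missing negative-path handling and the test for it; the names (Account, team_menu, fetch_subscription) are hypothetical stand-ins, not the actual product's API:

```python
# A minimal sketch (hypothetical names, not the real product): handle the
# default "no team" account state explicitly and test it, rather than
# relying on a fake team to hide the gap.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Account:
    user_id: str
    team_id: Optional[str] = None  # default account state: not part of any team


def team_menu(account: Account) -> dict:
    """Return what the team menu needs, degrading gracefully for team-less
    accounts instead of firing API calls that are guaranteed to fail."""
    if account.team_id is None:
        return {"team": None, "message": "Join or create a team to see subscriptions."}
    return {"team": account.team_id, "subscription": fetch_subscription(account.team_id)}


def fetch_subscription(team_id: str) -> dict:
    # Stand-in for the real subscription API call; only reached when a team exists.
    return {"team_id": team_id, "plan": "standard"}


# The negative-path test that gets skipped when you only test what the app
# is supposed to do: exercise the default state, not just the happy path.
def test_default_account_has_no_team():
    assert team_menu(Account(user_id="u1"))["team"] is None
```

The point isn't the specific code; it's that the default account state gets a first-class branch and a test, instead of a fake team bolted on at signup.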
 

EviLore

Expansive Ellipses
Staff Member
Just that women are less likely to be interested in software development careers. The mob went nuts on him.




Some manner of manchild loser who should not be taken seriously.

As Blow states, interest != aptitude. Software development is a particularly solitary pursuit. Women, on average, are more social, and more interested in helping people, hence bright and technically minded women tend to pursue MDs more often than engineering degrees (still plenty of women in engineering and CS though). But women are totally capable of either path and should not be discriminated against or thought of as not suited to any field.

The important takeaway from interest is that it's not a secret subliminal form of unconscious oppression if certain fields aren't 50/50 demographic splits between the sexes. At some point we'll fail to "correct" for the demographic imbalance in fields like CS, and we'll need to come to terms with the fact that it's an expression of interest.
 

recursive

Member
yep, the "let's get rid of grades", and push 'DEI' agendas at all costs instead of excellence, will make us go backwards as a species. There are some countries not doing this though (cough Russia, China... .and they will probably advance above the west in the years to come if this continues. Then there is ai and other such things that will further the current issues.
I can't wait until DEI ambassadors are replaced by AI. Really all of human resources could be.
 

Yoda

Member
Speaking as a "Big Tech" employee and as a consumer, the incentive to "do it right" isn't there most of the time.

On the former: internally, companies will always reward projects which ship sooner over ones which don't. This leads to an overwhelming majority of employees focusing on short-term wins, since otherwise they'll miss out on being considered for promotions or, in the more extreme scenarios, be managed out. I've seen this behavior at 3 different large-cap tech companies now; despite most of the companies having different cultures, this point is ALWAYS the same. The current "tech recession" has exacerbated it even more.

On the consumer side, I think the recent Pokemon games are a good microcosm of this problem. While I don't play them anymore, my wife is a big fan. I was dumbstruck at how bad the most recent mainline game was. It looked a generation behind lots of the games that are playable on the Switch, and despite this it dipped to ~20 FPS all the time and had bugs where she'd lose progress... Yet she still played the entire thing. If I'm not mistaken this was the Switch's best-selling console exclusive of all time? If you're working at the Pokémon company and DON'T shovel more crap out the door as quickly as possible, you'd probably be managed out for leaving money on the table.

I think part of the problem is corporate consolidation over the last few decades for sure (this probably explains the Boeing problem best), but tbh I'm not sure more choice would fix the issue entirely. Consumer behavior is just fucking stupid: if you don't like something, you have to abstain from giving your money/attention/whatever the company wants from you. Sadly, too many people are blinded by marketing, nostalgia, hype, etc...
 

RoboFu

One of the green rats
This cannot be overstated. ChatGPT is a ridiculously useful tool in programming, but it's going to make people severely lazy. I'm already forcing myself not to just feed problems into it and copy-paste the results as a default method of coding. Now, I'm not fresh out of Uni; I understand what the code is doing and where it is and isn't going to work, most of the time. But it probably shouldn't be the first thing I do when I'm approaching a problem.
I’ve tried a few things in ChatGPT for coding and it’s been horrible every time, but it kind of works. That definitely could be a big issue.
 

Barakov

Member
Well, this Jonathan Blows.
 

DrFigs

Member
I'm still early in the video, but the discussion about lost technology from medieval and Greek times is so fascinating.
 
Last edited:

ReBurn

Gold Member
Things changed when software moved to a service model and venture capital started paying for the creation of so much of it. When I started out, software had to be solid because it was tough to update after it was installed. I used to fly to customer sites to install on minicomputers or mid-range boxes and set up remote handheld devices and, if necessary, modify on-site during go-live. I slept on the floor in a warehouse or office building many times pushing through implementation. If you wanted to update it you had to dial in via modem, upload new binaries and hope you could test. Or send out disks and instructions for them to install the update themselves.

Nowadays software launches as minimally viable with bare features and evolves as companies can secure more VC funding to keep building it. That's where attitudes about it started to change. It's relatively easy to update now with practically everything being SaaS or delivered via internet. It's so easy to update that software companies have become complacent about quality.
 

Drew1440

Member
I've noticed this with a lot of mobile apps: they are very buggy and slow despite being constantly updated. I'm sure Spotify gets an update each week without any noticeable difference. It seems modern software is a complicated mixture of high-level libraries, possibly to make it portable so it can work on a wide range of devices.

 

Fredrik

Member
Watched it. Very interesting talk.

He’s spot on, we’ve clearly got used to it, the slow decline. I think we can see it though if we just stop and think back on how it used to be. Like, I could turn on my C64 and load up The Last Ninja, and if the power doesn’t go off or the computer doesn’t break, I think the game would still run 10 years from now. Why wouldn’t it?
And on the Amiga 1200 with Workbench (like Windows) on the HDD, there is no shutdown functionality in the OS that closes things in the correct way. You just flick the power button off. Nothing bad will happen. No safe mode on startup. It just doesn’t do anything in the background that you’re not telling it to do. It just displays your folders and files and sits there waiting for you to start an executable file.
 

freefornow

Gold Member
45:00 mark

"It's actually impossible on a PC right now to render at a smooth frame rate. It is simply not possible no matter what you do... we don't even have that capability."

Explains a lot.

Well There It Is Jurassic Park GIF


48:05 His discussion re: Unity/Unreal feels on point with a lot of what we are seeing in games released atm.
 
Last edited:

EviLore

Expansive Ellipses
Staff Member

There's a blog post that made the rounds recently...


When Twitter fired half of its employees in 2022, and most tech giants followed suit, I wasn’t surprised. In fact, I think little will change for those companies. After being employed in the tech sector for years, I have come to the conclusion that most people in tech don’t work. I don’t mean we don’t work hard; I mean we almost don’t work at all. Nada. Zilch. And when we do get to do some work, it often brings low added value to the company and its customers. All of this while being paid an amount of money some people wouldn’t even dream of.

What is happening right now in tech may be one of the greatest market inefficiencies—or even deceptions—in history. I am writing this article because I think outsiders deserve to know what’s really going on in the field.

I know my statement may sound a little bit hyperbolic—how could people be consistently paid a lot to do close to nothing? Surely that can’t be right! Well, let me share some examples from my own experience.

Five months ago, I was hired as a software developer by one of the world’s most prestigious investment banks. While I prefer to do freelance work because it involves real work, I was looking to have a bit more stability for a while, so I gave a chance to a normal corporate technology job. Since the beginning of my employment, five months ago, I’ve worked for around three hours in total (not counting non-focused Zoom meetings which I attended without paying much attention).

When I first joined the company, I was excited. However, since I joined they’ve only given me tasks that were exceedingly easy to complete, in just a few minutes, but allocating days or even weeks to them. At first, I wanted to speed things along. I genuinely wanted to build a cool product, so I connected with people across the organization to ask questions about our intended users, their needs and how our product would satisfy them. But it was soon made clear to me, a few times, that I shouldn’t do that. One person told me, “I don’t want to tell you not to ask too many questions, but…” and she basically told me not to ask too many questions.

I soon realized that the project was overstaffed and most people were pretending to work. And I also realized that was the job I was hired to do; my job was to pretend. If this had been the only time this ever happened to me, I would consider it an anomaly. Unfortunately, this has been the case with almost every tech job I’ve had for years.

Consider the case of my previous tech job, in which I was hired as a data engineer for one of the world’s largest telecommunications companies. In the year and a half that I worked for them, there was only one two-week period during which I worked at full capacity. Other than that, I did almost nothing at all for the remaining 18 months. Outside those two productive weeks, most of my work involved attending irrelevant meetings, performing small tasks to pretend that a broken product worked well and even generating fake results. It felt ethically compromising and it was boring; I only worked half an hour a week or so, non-focused meetings aside.

I could go on with many other similar personal stories, but you get the gist.

This isn’t just me. All of the people I know who work in tech seem to be going through the same thing. One of my former colleagues, for example, told me all he does at work is watch Coursera courses. He’s considering resigning after the company-sponsored Coursera subscription ends. Another former colleague was hired a year ago as a data scientist for a large oil company. She’s making 200,000 pounds a year. All she does is prepare a PowerPoint presentation every week, and she’s utterly bored.

Another one of my friends was hired two years ago as a quant for one of the world’s most important investment banks (yes, that one). The interview process for that kind of job is among the hardest you can imagine—brain teasers, differential equations, graph algorithms. He was very excited at first, thinking he’d be building cutting-edge technology. However, while people always seem to appear busy from the outside, in reality he does almost nothing at all and is horribly bored but well paid.

It is hard to speak about this with others. Someone once told me that my frustration with the traditional tech workplace reminded him of Meghan Markle and Prince Harry because I kept complaining about stuff despite being in a highly desirable situation. Indeed, for a lot of people, the idea of being paid a lot to do nothing sounds like a dream come true. However, while we may not do almost any real work, we do have to constantly pretend that we do. That can be extremely frustrating and soul draining. Moreover, this shaky situation cannot last forever; it’s like a poorly balanced house of cards. With the recent massive layoffs and the collapse of SVB, the signs of strain are already there.

Big corporations have trouble getting things done. Too much bureaucracy and bloat. But they're capturing a big chunk of the talent in the industry thanks to superior compensation.
 

wipeout364

Member
There's a blog post that made the rounds recently...




Big corporations have trouble getting things done. Too much bureaucracy and bloat. But they're capturing a big chunk of the talent in the industry thanks to superior compensation.
Thanks for posting that. It’s just one person, but it’s crazy if that is the actual state of things. Honestly, it’s not hard to believe. I worked for a multinational for a short time, and while it was not that extreme, it did kind of feel like you should be doing more but there never seemed to be anything to do, so it was this weird, uncomfortable feeling a lot of the time.
 
Last edited:

Hoddi

Member
This isn't just limited to video games. Even companies like Apple have been increasingly dropping the ball in recent years, even on basic security.

You can literally hijack someone's Apple ID simply by stealing their phone and knowing the PIN code. That alone allows you to change both the password and the 2FA phone number. And it's been an issue for 5-6 years.
 

K' Dash

Member
Ehhh, more like modern education and modern workflows like Agile are creating "good enough" programmers, as in "it doesn't crash too badly so it's ready to release", rather than skilled ones.

It's only gonna get worse with every idiot coming out of college trying to use chatgpt in their jobs.

That is not Agile at all.
 

K' Dash

Member
There's a blog post that made the rounds recently...




Big corporations have trouble getting things done. Too much bureaucracy and bloat. But they're capturing a big chunk of the talent in the industry thanks to superior compensation.

I work at a fintech that’s not that big, and this is how most of the code gets pushed to prod.
 

th4tguy

Member
We had a meeting with the head of software at my work where he flat-out said that studies show more features equal more money/sales. We can fix any problems after release, but regular major release days need to happen bi-yearly, with a new functionality pack every quarter.
Close to release, we ignore bugs to make sure as many features as possible are finished so the boxes can be checked.
Any documentation is an afterthought and commented code is practically nonexistent.
A simple one-line fix for a bug took me and a colleague a week to figure out where exactly to insert it.
Agile development is stuuupid.
 
Last edited:

Fredrik

Member
45:00 mark

"It's actually impossible on a PC right now to render at a smooth frame rate. It is simply not possible no matter what you do... we don't even have that capability."

Explains a lot.

Well There It Is Jurassic Park GIF


48:05 His discussion re: Unity/Unreal feels on point with a lot of what we are seeing in games released atm.
I don’t think he’s talking about any stutterstruggle things there, just that the framerate isn’t stable, ever, and they never have full control of the tasks the computer needs to do.

So the hardware needs to be powerful enough to handle the short bursts of extra load when suddenly needed, without anybody really knowing why, and the screen needs to mask an always-fluctuating framerate. It’s been like that at least since I started gaming on PC in 2014; the issue was called micro-stutters back then. I went with a G-Sync screen right from the start so I was never really affected by it.

Meanwhile, 35 years ago, you could play a game like Giana Sisters at a constant 50fps (PAL) with a CPU running at a whopping 1MHz. And as said earlier, there were never any crashes or freezes; it would just run without issues or stutters until you powered off the computer.
 

Hoddi

Member
Agile development is stuuupid.

I don't really care what it's called. I've had medical applications breaking on Win10 between versions 1909 and 20H2 simply because of TLS changes. Whether it's called an 'agile workflow' is the least of my problems because both of them are simply supposed to be 'Windows 10'. And that means they're supposed to be compatible.

But they're not.
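Not a fix for those particular applications, of course, but to illustrate the underlying failure mode: a minimal Python sketch (assumed client-side code, not the medical software in question) that pins its TLS version range explicitly rather than inheriting whatever the OS or runtime defaults happen to be after an update.

```python
import ssl
import urllib.request

# Hypothetical illustration: an explicit TLS version range means an OS update
# changing its defaults doesn't silently change what this client negotiates.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older
context.maximum_version = ssl.TLSVersion.TLSv1_3

with urllib.request.urlopen("https://example.com/", context=context) as response:
    print(response.status)
```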
 
I've been feeling this for the last 10 or 12 years. The quality of programs and drivers, and basically all software, is slipping deeper and deeper into an ever-worsening state. It's embarrassing, it's frightening, and it's rage-inducing, especially when it can't be avoided because it's forced on you when you want to stay up to date. What a mess.
 

I Master l

Banned
The first example he cites is at Boeing, where pure software problems started to crash planes. Increasingly buggy, poorly optimized code is masked by the consistent strides made in hardware computational power.
The Boeing 737 MAX problem is not software like they claim; it's hardware. They tried to put a fuel-efficient engine on an old design, which made the airplane's balance all fucked up.
 

IFireflyl

Gold Member
In the software development course that I just finished, we had an entire section on releasing software. I was told that software developers need to balance bug fixes with release schedules and shareholder expectations. They went on to say that there is a cost benefit to stakeholders in releasing the application when it is in a "good enough" state, with developers simply planning for bug fixes to come after the initial release. While I understand that it's virtually impossible to release a bug-free piece of software, I found it ridiculous that prospective software developers are being advised that they need to get comfortable releasing software that has known bugs.
 

Fredrik

Member
I’m too dumb to program anything at the engine level, so I think it’s alright if I keep using Unity, but I do hope some clever people who listened to Blow got a wake-up call and will start doing real low-level coding again.

I’ll send the video to my brother, who used to program in machine code on the C64. It will probably make him proud to actually understand all of that, and sad to realize how little he uses it and how unstable programs are now.
 