Executing Software Engineers for Bugs
When people bring up a code of ethics, actuaries seem to come up a lot because they can potentially lose their license for mis-evaluating risks. But anecdotally, what I've heard is that in reality, if your boss asks you to sign off on something and you refuse, they'll find someone else to sign off on it and just label you as not a team player. At which point you've harmed your career and had no real impact.
I bring this up because when people propose a code of ethics, they assume it will help, but I'm not convinced we will ever get to a point where codes of ethics are widespread and taken seriously enough that they do anything but harm the altruists among us.
>But making the leap from that to "my code can't harm people" is a bridge too far.
Meh. My application is an internal app for a large company. It's basically scheduling software for a part of our business process. To even start to hack it you'd first have to break into the corporate network, and in the end you'd have data you didn't care about. Hell, I'm not even sure the people who use it care.
Worst case, a subtle bug (and it would have to be subtle for my users to miss it) might cost my employer a few thousand bucks.
Again, meh. There are a whole lot of internal applications that fall into this category.
"Executing Software Engineers for Bugs"
Never have I been prouder to be a plain old programmer. Those software engineers, architects, systems analysts, and data scientists can keep the capital punishment for themselves.
This author's first sentence is: "There is an apocryphal tale I've heard many times about how, in Ancient Rome (or, in some tellings, Greece), the engineers responsible for the construction of an arch were required to stand underneath it as the final wooden supports were taken out."
This story is derived from Law #229 of the Code of Hammurabi (of Babylon). Babylon was not part of Greece, but it was very briefly part of the Roman empire.
" 229 If a builder build a house for some one, and does not construct it properly, and the house which he built fall in and kill its owner, then that builder shall be put to death. "
I'm not a big fan of professional bodies or mandatory qualifications. I think these tend towards rent seeking, with these bodies existing mainly to justify their own existence. In the case of software engineering, I'm especially concerned about academics with little real world experience using valid security and privacy issues as an excuse to force their own view of how software engineering should be done on everyone else.
That said, my employer has standards for security and privacy that go well beyond industry norms, so if I was working elsewhere maybe I would feel the need for better standards across the industry.
In my experience, software engineers tend to be conscientious. Caring about the big picture is a big part of open source, hacker and nerd culture. But knowledge is hard to come by. I learnt from the experts, but I doubt most engineers would be able to build a simple CRUD app from scratch without major security holes.
It would be nice to see some best practices around security and privacy emerge without forcing everyone to write Ada or Coq or completely change their approach to writing software.
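To make the "major security holes" point concrete: SQL injection is the classic CRUD-app mistake, and the standard fix is parameterized queries. A minimal sketch using Python's stdlib sqlite3, with a hypothetical `users` table invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

# Attacker-controlled input, crafted to rewrite the query.
name = "alice' OR '1'='1"

# Vulnerable: string concatenation lets the input become part of the SQL,
# so the WHERE clause matches every row.
vulnerable = conn.execute(
    "SELECT email FROM users WHERE name = '" + name + "'"
).fetchall()

# Safe: a parameterized query treats the input purely as data.
safe = conn.execute(
    "SELECT email FROM users WHERE name = ?", (name,)
).fetchall()

print(vulnerable)  # leaks every row
print(safe)        # empty: no user is literally named "alice' OR '1'='1"
```

The point isn't that this one fix is hard; it's that a developer who was never shown it has no reason to suspect the concatenated version is dangerous.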
We can write software like this.
The reason why we don't has been explained many times: It's too expensive, by many orders of magnitude.
OK, I think we all can agree that programmers almost never begin a project with the intention of causing harm. All the examples cited involved situations where immense pressure applied by higher-ups impelled a programmer to knuckle under and agree to such approaches.
The solution on offer seems very specific to our time: we take the fall guy who caved to one sort of incentive and put an opposite incentive on him to force a different behavior (btw, without removing the first incentive). How fucked up is that?
Obviously, the better, saner solution is removing the existing perverse incentives, giving software engineers more leverage in decisions, and punishing higher-ups if they don't give software engineers the autonomy to make decisions. And heck, execute the CEOs when the bridges collapse. The buck should stop with them, right?
Until it is the CEO, CTO, and VC standing under the bridge, nothing will change.
Those of you with a membership in ACM or IEEE, I'm interested in hearing your thoughts on why you do.
Those of you who don't, would you consider pledging yourself to a code of ethics if they met your requirements? What would those requirements be?
No! This is the equivalent of saying that kids shouldn't have chemistry sets or be able to play on jungle gyms. Saying that critical infrastructure needs to be "bug-free" is one thing, saying that dating site apps need to be is ridiculous by comparison.
And even in the case of bridges and the like, they do have a tendency to collapse due to earthquakes, wars, and so on. Given enough time, most everything has bugs.
An old observation, but still relevant: we really have no liability in our profession, compared to other licensed professions. Why does software quality suck? Because our customers almost never sue us when our work does not meet expectations.
> Unlike our brothers and sisters in the other engineering disciplines, though, software engineering does not have the same history of mandatory formal education.
Was there a time when other engineering disciplines were not formally schooled? Did they develop into having formal education or were they born that way? Is computer programming moving in the same direction via organizations like the ACM?
Can we learn from other fields who have in fact experienced similar issues on some level? Do we need to reinvent the wheel of creating an effective culture that rewards all the values humans desire? What are those? Success, ability to express oneself, ... ?
There's no such thing as bug-free code. So yes, I've deployed code I knew had bugs. And if I took twice as long, and spent the extra time checking, it would still have bugs...
An effective code of conduct requires a stable environment, people knowing each other, and so on. There's no way it can be implemented in a profession that's booming at the rate programming is.
Whoever claims he has never seen his software deployed to production with bugs has never written anything of any complexity. Adding "... that he knew about" to that statement softens it a little, but not by much.
The problem with comparing software engineers to civil engineers lies in the latter's ability to constrain their users, e.g. stating the weight limit for a bridge. Try doing that with software and enforcing it.
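For what it's worth, software can state and enforce a limit in a narrow sense. A minimal sketch of the rough analogue of posting a weight limit, with a hypothetical upload-size cap (both the function name and the limit value are invented for illustration):

```python
MAX_UPLOAD_BYTES = 10 * 1024 * 1024  # hypothetical "weight limit": 10 MiB


def accept_upload(payload: bytes) -> bytes:
    """Reject payloads over the stated limit instead of failing unpredictably."""
    if len(payload) > MAX_UPLOAD_BYTES:
        raise ValueError(
            f"payload of {len(payload)} bytes exceeds limit of {MAX_UPLOAD_BYTES}"
        )
    return payload


accept_upload(b"x" * 1024)  # within the limit: accepted
```

The catch, and arguably the commenter's point, is that unlike a bridge, nothing physically stops users from hammering the check itself or reaching the system through a code path that forgot to include it.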
> If we were required to "stand under the arch", as it were, how many of us would have pushed back against things that our employers asked us to do?
We can't achieve this consistently with privately owned software. Software must be public and transparent to have consistent accountability.
In the case of Grindr, for example, I assume we don't know exactly who wrote the original code. And the company would probably protect that information.
I believe those who contribute to open source software can and do subscribe to the author's desired level of accountability specifically because they know everything they write is available for public scrutiny for eternity.
Regarding the various calls that it should really be the CEO who gets executed, I'd wager those ancient bridge builders probably were the "CEO" of the relevant bridge-building operation. Specifically, they probably managed substantial funds obtained from some Royal and had plenty of opportunity to trade off between personal profit and structural integrity.
Calls to transfer the idea to today's lowly programmer drone would be akin to an ancient bridge builder absconding with substantial extra profit after convincing his Royal that it really should be the bricklaying crew lining up under the bridge.
Some Mexican drug gangs do execute engineers for fucking up. And they recruit by kidnapping family. Hardball capitalism, for sure :(
140ce? What the hell does that mean?