The morality of software
My brother (Dr. Nadeem Siddiqui) and I drove from Dallas to Big Bend National Park in early November this year. It's the kind of long drive that encourages dialog, even between otherwise reticent siblings! During our conversation, which was by turns wistful, cerebral, and comical, one topic excited both our passions: the moral underpinnings of our respective professions.
My brother is a nephrologist. He has a successful practice and is well liked by his peers and patients. He has always been the studious one in the family: I remember how hard he worked on his physics problems because, unlike biology, physics didn't come naturally to him. He is undeniably smart, has worked hard for his success, and continues to learn and grow in his profession. I'm proud of his accomplishments as only a brother can be.
However (and this is where we passionately disagreed during the long drive), there is one aspect in which I believe he has it easier than I do: the moral guidelines in his profession are much clearer and easier to follow than in mine.
Medicine is an ancient profession. Eons after the first faith healers and soothsayers whispered their words in patients' ears and applied their earthy remedies, yet millennia before germ theory was established, the Hippocratic Oath formed the basis of what was moral in the practice of medicine. Of course, law dictates what's legal and what's not, and modern medical malpractice laws can be just as arcane and abstruse as tax laws. However, I venture to say that for the vast majority of a doctor's career, her daily routine is uninterrupted by questions such as "what are the moral implications of ordering this test, prescribing this drug, or having this frank conversation with the patient's relatives?" The art and science of my brother's craft are difficult (and I'll happily admit, much more so than what I do for a living). However, the morals are easier.
Imagine the worst person in a society (a child murderer, perhaps) and then ask yourself what a nephrologist would be obliged to do if such a person presented at her hospital with renal failure. Of course, the answer is that she would treat this patient to the best of her abilities. The punishment of this criminal is a matter for the law to decide; no one with any moral decency would say that "a doctor shouldn't treat a child murderer with renal failure, and thereby inflict a painful and slow death". (This isn't a "No True Scotsman" fallacy: there are actual laws codifying this morality to ensure that criminals receive good healthcare.)
Compare this to what a software engineer faces in her profession. Take the Volkswagen "dieselgate" emissions-cheating scandal as an example. Let's set aside the layers of bureaucracy and subterfuge in this scandal and look at the conditional logic for the "cheat" itself, sketched in code below.
In a nutshell, this logic determines when to apply the "customer specific acoustic condition" which, when true, puts the engine control unit (ECU) in "testing compliant" mode. (The "customer" in this case is VW itself.)
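Since the original diagram doesn't reproduce well here, below is a minimal sketch in C of the shape of that logic. To be clear: every name, threshold, and check in it is an illustrative assumption of mine, loosely patterned on the published analyses cited in the footnotes, and not the actual firmware symbols.

#include <stdbool.h>
#include <stdio.h>

/* Illustrative sketch only, not the actual firmware. Names and
 * thresholds are assumptions, loosely based on the analyses in
 * footnotes [1] and [2]. */

typedef struct {
    double elapsed_s;      /* time since engine start, in seconds     */
    double distance_km;    /* distance driven since engine start      */
    double ambient_temp_c; /* ambient temperature, in degrees Celsius */
} DrivingState;

/* Stand-ins for the two dosing calibrations in the real ECU. */
static void use_compliant_calibration(void) { puts("dosing: test-compliant"); }
static void use_road_calibration(void)      { puts("dosing: road"); }

/* The real firmware compared the distance-over-time profile against
 * curves bracketing the official test cycle (the NEDC covers roughly
 * 11 km in about 20 minutes). This toy average-speed check merely
 * stands in for that profile match. */
static bool acoustic_condition(const DrivingState *s) {
    double avg_speed_kmh = (s->elapsed_s > 0.0)
        ? s->distance_km / (s->elapsed_s / 3600.0)
        : 0.0;
    return avg_speed_kmh > 25.0 && avg_speed_kmh < 40.0          /* NEDC-like pace  */
        && s->ambient_temp_c > 20.0 && s->ambient_temp_c < 30.0; /* lab temperature */
}

/* When the "customer specific acoustic condition" holds, select the
 * emissions-compliant calibration; otherwise, the dirtier road one. */
static void select_dosing_mode(const DrivingState *s) {
    if (acoustic_condition(s)) {
        use_compliant_calibration();
    } else {
        use_road_calibration();
    }
}

int main(void) {
    DrivingState lab  = { 1180.0, 11.0, 25.0 }; /* looks like a test cycle */
    DrivingState road = { 1800.0, 45.0, 12.0 }; /* ordinary highway drive  */
    select_dosing_mode(&lab);   /* prints "dosing: test-compliant" */
    select_dosing_mode(&road);  /* prints "dosing: road"           */
    return 0;
}

A real ECU matches the full distance-over-time curve of the test cycle rather than a crude average speed, but the structure, one innocuous-looking conditional selecting between two calibrations, is the same.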
Beyond the fact that this logical condition has nothing to do with the car's acoustics, ask yourself whether the moral implications of working on this "optimization problem" would have been apparent to you had you been one of the engineers on the team.
I'm not absolving anyone of guilt in this criminal matter. My point is that the moral line in the software business is far fuzzier, more context-sensitive, and more obscure than in medicine.
I have to think regularly about the moral implications of the source code my teams and I write. I try to stay true to the principle of "first do no harm with the software you write". Yet I have occasionally failed because of the moral obscurity inherent in my profession. One of the few regrets in my consulting career is from when I helped a gambling company in Ireland improve its online gambling portal. One could say this should've been obvious to me: gambling addiction in Ireland (and elsewhere) is a serious problem. However, before I joined the team, I didn't know whether I'd be helping directly with their gambling business or with something more benign or even beneficial (e.g. their gambling addiction support system). One could still ask, "why work at a gambling company in any capacity?" Fair point, I admit; however, it is not as firmly and indisputably established as the Hippocratic Oath.
Our profession is young, and the software we write is often far removed from its societal impact: the distilling effects of abstraction reduce thorny human issues to user stories and optimization problems. In ensuring that we first do no harm, we have it harder than doctors do. Therefore we must forever be vigilant and routinely ask ourselves: "what is the moral impact of this software I'm writing?"
Footnotes:
On the VW "dieselgate" scandal, there is ample detailed material describing how the "cheating" firmware was written. Felix Domke, in particular, has done a lot of work in this area. Below are a whitepaper (of which he's a co-author) and his keynote at the Chaos Computer Club conference:
[1] https://cseweb.ucsd.edu/~klevchen/diesel-sp17.pdf
[2] https://media.ccc.de/v/32c3-7331-the_exhaust_emissions_scandal_dieselgate
David Bollier has argued that the dieselgate scandal confirms that open-source software is the way to prevent software that endangers the public. This opinion, "crowd-sourcing morality", is an interesting corollary to the dictum "given enough eyeballs, all bugs are shallow". It certainly deserves its own article!
[3] https://www.bollier.org/blog/volkswagen-scandal-confirms-dangers-proprietary-code