Engineers should take more responsibility for immoral tech
The morality of tech is a knotty topic. Who bears responsibility when, say, Samsung phones began exploding? Or when Tesla’s dashboard interface became unusably buggy? Usually, the answer is everyone and no one.
Clever, professional teams can collectively allow really stupid mistakes to happen, because the problem somehow finds a way to slip through the gaps.
It’s the same dynamic that results in metal instruments being left inside people during surgery.
Sometimes, it’s intentional.
Now, that’s not to say that issues in tech are always the result of professional negligence. As we all well know, nefarious intentions are often the problem; the Cambridge Analytica scandal shone a light on how tech could be horribly misused.
Even jokes about ‘getting that notification dopamine hit’ are commonplace nowadays. People seem increasingly aware that Big Tech has some grasp on our collective reality in a way it didn’t 5, 6 or 7 years ago. And most of us know that if we aren’t paying for something, we’re probably the product.
Where does the blame lie?
Where blame lies for these issues is - as mentioned - difficult to pin down. In lots of instances, users are told it’s their fault for being apathetic about privacy; that they’re too willing to tick whichever Terms and Conditions box they’re offered.
I’ve literally written parts of Terms and Conditions in the past, and I can tell you: not even I read the whole thing. In the gauntlet of daily life, it’s not fair to expect busy people to be 100% on top of every online decision they make. The responsibility for making sure privacy is protected by default lies with the tech companies themselves. And if they can’t step up to the plate, then government regulation needs to.
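To make “privacy protected by default” concrete, here’s a minimal sketch of what the principle looks like in code. Everything here is hypothetical (the class, field names, and function are mine, not any real product’s API): every data-sharing option starts switched off, so a user who never opens the settings screen shares nothing.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PrivacySettings:
    # Privacy by default: every option is off until the user explicitly
    # opts in, rather than on until they hunt down an opt-out.
    share_analytics: bool = False
    share_location: bool = False
    personalised_ads: bool = False

def collect_event(settings: PrivacySettings, event: dict) -> Optional[dict]:
    """Record an analytics event only if the user explicitly opted in."""
    if not settings.share_analytics:
        return None  # the default path: nothing is collected
    # Even when opted in, keep only what is strictly needed.
    return {"event": event["name"]}

# A fresh user who never touched the settings shares nothing.
user = PrivacySettings()
print(collect_event(user, {"name": "page_view", "ip": "203.0.113.7"}))  # None
```

The design choice is the point, not the code: the burden of action sits with the company wanting the data, not with the user trying to protect it.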
Take Download Festival, for example. Organisers ran facial recognition in a bid to catch criminals: every face that entered the festival was scanned and logged, without informed consent. Complaints followed from those who rightly felt their privacy had been invaded, but many simply didn’t realise there was a problem.
The only way to limit this kind of thing is through the law (unless you intend to take a baseball bat to every camera...).
What can be done?
Knowing you’re the product helps, as mentioned. If we all knew it, more complaints might have been lodged at Download.
But I also think engineers need to take responsibility.
Which is why it was great to see Google’s decision to withdraw from the Pentagon’s Project Maven last year. Thousands of Google workers signed an open letter to CEO Sundar Pichai criticising the contract to help develop AI for analysing military drone footage. The pressure they applied worked!
But it isn’t the same at smaller companies, where you might only have a three- or four-person team of devs. Change doesn’t have to come in the form of an open letter, and it doesn’t have to be anyone’s “fault”. What you can do, though, is open a conversation if you have any concerns over the morality of what you’re developing.
As the people who create the software, engineers are the gatekeepers of new tech.
Comments

Robotics recruitment specialist (3 years ago): A really interesting read, Matt! I can definitely agree with you that both decision makers and engineers need to stand their ground as much as possible when the technology being imposed is morally or functionally wrong.

Technical Customer Success Manager (3 years ago): I’ve not been in that position, luckily, that I know of, partly because I stay away from sectors that could be in areas of questionable morality (gambling, spying, weapons, phone hacking et al.).

AEM Architect (3 years ago): At the end of the day, the psychopaths in charge can’t do a thing without us. I believe quite strongly that engineers should be held responsible for the misuse of technology they create.