Is Empathy the cure to the evils of AI?
Daniel Murray
Transforming Business Culture with Empathy | Keynote Speaker, Empathy Expert & CEO at Empathic Consulting
Welcome to edition 76, and thanks for being a subscriber to my newsletter. Down below, I am giving you a sneak peek at the working cover for my new book, due out through Wiley later this year... I'd love to get your feedback.
This week, for my wonderful readers, I also have some special offers! I've put them up top for those who want quick access, with my usual musings below.
The piece this week explores the responsibility that will sit with all of us as we navigate a world of AI-driven outcomes.
Click the image to access special offers! https://empathicconsulting.com/bonus-offer
Those dogs were going to hurt somebody one day
In 2001, Diane Whipple had just picked up groceries and was heading home to her apartment in Pacific Heights, San Francisco. At the same time, her neighbour, Marjorie Knoller, was leaving her apartment to take her dog, Bane, for a walk. Bane was an imposing animal: a 60kg Presa Canario, a breed known for its large build, confident temperament and protective nature.
As Whipple approached her door, Bane lunged at her, knocking her to the ground. Knoller, unable to control the hulking animal, watched in horror as Bane mauled the defenceless Whipple. Her desperate screams caught the attention of Knoller’s other dog, Hera. Also a large Presa Canario, she pushed her way out of Knoller’s apartment door and joined Bane in the attack.
An autopsy found the dogs had inflicted 77 bite wounds on Whipple, including one that severed her jugular. Whipple was rushed to hospital, but the blood loss and trauma were too great. Knoller was charged with second-degree murder, involuntary manslaughter and owning a mischievous animal that caused death.
In an interesting twist, Marjorie Knoller was a lawyer, as was her husband, Robert Noel, who was charged with involuntary manslaughter and owning a mischievous animal that caused death. In their defence, Knoller and Noel insisted that the attack was unexpected and accidental, and that Whipple may even have provoked Bane in some way. Knoller also insisted that she was simply unable to stop Bane and Hera in the moment of the attack.
This was a tragic, complex and controversial incident. A first trial found Knoller guilty of all charges and sentenced her to 15 years in prison, with her husband Noel receiving 4 years. Two years later, an appeal successfully argued that while Knoller had acted recklessly, there was no proof she intended to cause Whipple's death or knew the attack would be fatal, and her sentence was reduced to just 4 years. However, the California Supreme Court reviewed the case and found this second judgement to be incorrect.
The Supreme Court found Knoller had been warned of the danger her dogs posed to others, knew they had attacked people before, and still failed to take basic precautions to mitigate the risks. Moreover, her lack of remorse for the victim, and her claim that Whipple was partially responsible for not closing her own door, were taken into account when her 15-year prison sentence was reinstated.
In the future of AI, this incident might prove an incredibly important case. While it might be more challenging, could you argue that YouTube's algorithm was partially responsible for some of the terrible human behaviour that perpetrators have attributed to it? And what about when the actions of the algorithm are much more direct?
Just because we don't know why an algorithm took a certain action, does that mean no human is responsible for the damage it causes? It has echoes of the US National Rifle Association slogan: guns don't kill people, people kill people. Will the same be said when AI is doing the killing? While there are increasing calls for regulation and control of AI, its proliferation into every area of our lives is unlikely to slow down. The genie is out of the bottle, and it's unlikely anyone can put it back in. If we can't stop the AI, we must better understand ourselves.
This is why we so desperately need more empathy right now. Having a greater understanding of the people around us, of the impacts our actions could have on others, and of whether those impacts are ethical are all still human accountabilities. We must think carefully about the systems and scenarios we deploy AI within. We need to understand more, and not just the technical aspects but also the ethical impacts. Unlike Mickey in The Sorcerer's Apprentice, we might not have a Sorcerer coming down to save the day. I suspect we will have to live with the outcomes our AI tools deliver and hope that, unlike the enchanted broomstick, we will be able to switch them off when they go awry.
It is my hope that empathy will provide us with some buffer against this danger. We are typically good at anticipating the positive impacts our efforts might have on people like us; how well we can anticipate the impacts those actions might have on people very different from ourselves will determine the damage we cause.
At subsequent parole hearings, Knoller has expressed regret and remorse for her failure to prevent Whipple's death. However, this is little comfort for Whipple's family and friends. The world of AI will not only present new dangers but will accelerate their impact in ways we've never seen before. Today is the day we all need to be more human. Today is the day we need to commit to having more empathy, not just for those on our side, but for all people. While we invest heavily in developing artificial intelligence, we need to also invest in being empathically human.
If you like this article, you will love my new book, due for release later this year. Here is a sneak preview of the working cover... let me know what you think!
With Empathy,
Daniel