When AI harms people: Tesla and Character.AI
Today, Tesla's Autopilot and Character.AI are both being investigated for causing or contributing to human deaths. Both situations are tragic, and neither is simple. I want to share my thoughts and draw out some lessons for any business looking to leverage AI in a product. I'd love to get a discussion going, particularly with folks who work in marketing to children and teens.
Character.AI: You’ve likely heard about a 14-year-old Florida boy who used Character.AI, an online service, to create an AI girlfriend. He became increasingly enamored of her, withdrawing from his previous friends and interests and eventually shooting himself. Further details are here and here. His mother, a lawyer, is accusing Character.AI of responsibility for his death and has received support from some parts of the tech world including Kara Swisher.
The whole episode is tragic, and any parent's heart breaks for the mother. (I haven't seen anything about a father.) The leaders of Character.AI do seem to have taken a cavalier attitude, allowing users as young as 13 in the US and 16 in Europe, with no parental controls.
The tech-bro founders stated publicly that an always-on, always-compliant companion will be "super, super helpful to a lot of people who are lonely or depressed." Link here. If nothing else, it illustrates the idiocy of speaking about health issues when you have absolutely no expertise nor, apparently, any empathy.
Based on what I have read, it seems to me the company is far down the list of who is at fault. As a mother I hate to blame a grieving mother, but the more I look into this, the more I believe she bears primary responsibility for this tragedy.
1) The young man's change in behavior was troubling enough for her to arrange for him to see a therapist, who agreed he was in distress. Yet the teen was able to get access to her husband's .45-caliber handgun. Voluminous high-quality research has found that guns – particularly handguns – in a home are a major contributor to suicide. If nothing else, access to a gun turns a plea for help into a fatality. Roughly 260 US teens die by gun suicide each year.
2) Reports state that she was concerned he was absorbed "in his phone." Character.AI costs $10 per month, and few 14-year-olds have their own credit cards, so it's very likely this charge appeared on her credit card bill month after month. If true, she had a window into what he was doing. In her defense, lots of teens get obsessed with fantasy and celebrity, but only with AI can the faraway objects of fascination appear to interact with a troubled young man whenever he chooses.
Putting primary responsibility on the mother doesn't let the tech company off the hook. It's clear that in their zeal to get to market first, they didn't stop to ask experts about possible implications for fragile users.
· Transcripts of the chats show that when he mentioned self-harm earlier, the bot told him not to do it, but didn't tell him to put down the phone and talk to an adult. The chat should not have continued. In the final encounter he said he was "coming home" to her, which she encouraged. Perhaps the LLM behind the bot did not register the euphemism for dying. Clearly, an LLM is not qualified to counsel a troubled teen.
· Similarly, the hours he spent on the site should have sent up a flag that this is not normal, but the site did not have any mechanism to notify a parent, though controls are reported to be coming.
· Also coming is signage on the screen reminding users that the bot is not a person and not to take its advice. This is a step in the right direction, but the human tendency toward anthropomorphism is so strong that I doubt users will see it after the first few screens.
Net, the service is making changes after this tragedy, but I'm skeptical it can be made safe, especially for teens. I'm very interested in hearing from marketers who target children and teens ethically. What should the service do? Should it just be banned?
Beyond a site just for chat, many companies are going to want to use avatars as customer support agents and shopping assistants. While the desire and the technology to make them highly realistic will be there, I’d advise making them purposefully non-human and not flirtatious or sexy.
Tesla Full Self-Driving: In October, NHTSA opened an investigation after four reported crashes of Teslas with FSD engaged. One was fatal.
Tesla is the car brand most associated with autonomous driving, but according to Consumer Reports, as of October 2023 its system is barely mid-pack in quality. Tesla relies on cameras alone, while the state of the art has moved to lidar, which bounces laser pulses off objects to measure distance directly. My guess is that they will get their lunch handed to them by NHTSA.
But the bigger problem is how the human interacts with the system.
In my first newsletter I discussed that AI is different from traditional software because it can learn and improve so long as it has a good feedback loop. This means that AI makers rush to get their models into widespread use long before they are optimized, because it is the use that optimizes them.
Tesla and the other makers are selling autonomous driving systems while telling drivers that they need to be ready to take over at any moment. Unfortunately, that's not how humans operate. From Consumer Reports: Pnina Gershon, a research scientist at MIT AgeLab and the MIT Center for Transportation & Logistics, points to data showing that drivers often develop overreliance on driving assistance systems after a relatively short period of use. The data also shows that distracted driving is more common when using driving automation systems. "Automation aims to free resources and, not surprisingly, drivers use these 'freed-up' resources to do other things than driving."
Different car makers have different systems for reminding drivers to stay engaged. Some are better than others; Tesla's is poor. But I am skeptical of the whole proposition. As my former colleague and accident-analysis expert Gautam Divekar once told me, "You can drive the car or you can have the car drive itself. Cognitively, you can't do both."
It's easy to say we shouldn't be using Autopilot, and I'll state here that I have a Tesla but do not have Autopilot, for just these reasons. However, we shouldn't set the bar for autonomous vehicles at perfection, because human drivers are far from perfect. Over 40,000 Americans die in auto accidents every year, and only a handful of those crashes involve any AI. We need safer transportation. Families are fearful for their teen and elderly drivers, but outside of city centers, living independently in the US means driving. We know of the crashes tied to autonomous driving, but we will never know of the sleepy and tipsy drivers who made it home safely because of it.
The obvious lesson from the Tesla case is that skimping on technology in critical situations is not OK, regardless of what it does to your costs. One big lawsuit will wipe out any savings. Beyond this, the most complex system in a car is the human. We are remarkable at how we process information: an experienced driver will barely notice scenery going by but will immediately recognize a thing that is out of place. Unfortunately, out-of-place things seem to be Tesla's Achilles' heel. The solution can't be telling a person to sit quietly but be ready to act at a moment's notice. Car companies have to get their training data some other way.