Beware of Digital Spies in Our Midst
Richard Chambers
Senior Advisor, Risk and Audit - AuditBoard (5X Deloitte Fast 500 Company) | Executive Director - The Audit Trail Academy | Award-winning author and blogger
There’s a joke making the rounds in workplaces across America, and likely in boardrooms, bedrooms, and across social media:
Wife to husband: “Why are you always whispering in the house?”
Husband: “Because I’m afraid the government is listening.”
Wife laughs. Husband laughs. Alexa laughs. Siri laughs.
As with many topics we joke about, the heart of the subject is not a laughing matter.
Digital voice assistants are ubiquitous these days, as billions of people around the world have — knowingly or not — traded in a bit of their privacy for convenience. According to U.K.-based Juniper Research, there were 2.5 billion digital voice assistants, such as Alexa, Siri, and Google Assistant, in use at the end of 2018. That number is expected to more than triple to 8 billion by 2023.
My first exposure to the power of these transformational tools occurred shortly after we added an Alexa to our house. We were all gathered around shouting commands at Alexa, to which she dutifully responded. Finally, my 3-year-old grandson shouted, “Alexa — I want a Bumblebee.” We all laughed. A short time later, I received a message from Amazon that a Bumblebee toy was on its way!
Digital assistants rely on passive listening technology that triggers or “wakes” the device once a recognized command is spoken, such as “Hey Siri.” That means the device is always listening for those trigger phrases or commands. This raises a host of questions related to what these assistants may be hearing — and recording.
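To make that concrete, here is a minimal, purely illustrative sketch (in Python, with a simulated text “audio” stream standing in for real microphone frames) of the general pattern: every frame is scanned locally for a trigger phrase, and only the speech that follows the trigger, until silence, is treated as recorded. None of the names or behavior here reflect any vendor’s actual code.

```python
# Minimal, hypothetical sketch of "passive listening" in general terms.
# A simulated list of text frames stands in for real microphone audio;
# nothing here reflects any vendor's actual implementation.

WAKE_PHRASES = ("hey siri", "alexa", "ok google")

# Stand-in for what the microphone "hears", one frame at a time.
SIMULATED_AUDIO = [
    "so the quarterly numbers look fine",
    "alexa, what's the weather tomorrow",
    "partly cloudy, says the forecast",            # follow-up speech after the wake word
    "",                                            # silence ends the conversation
    "anyway, back to the acquisition discussion",  # never recorded: no wake word
]

def contains_wake_phrase(frame: str) -> bool:
    """On-device keyword spotting: does this frame contain a trigger phrase?"""
    return any(phrase in frame.lower() for phrase in WAKE_PHRASES)

def listen(frames):
    """Scan every frame (always listening), but only record after a trigger."""
    recorded = []
    streaming = False
    for frame in frames:
        if not streaming and contains_wake_phrase(frame):
            streaming = True               # wake word detected: start recording
        if streaming:
            if frame.strip() == "":        # silence detected: stop recording
                streaming = False
            else:
                recorded.append(frame)     # this is what would leave the device
    return recorded

if __name__ == "__main__":
    # Prints only the two frames after the wake word; the chatter before and
    # after the conversation never "leaves the device" in this sketch.
    print(listen(SIMULATED_AUDIO))
```

The point of the sketch is simply that the scanning loop never stops; what changes is only whether the audio is retained and transmitted.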
The issue is especially important for organizations, which should understand what risks are associated with these “always on” devices. For example, digital assistants may not be standard office equipment, but some employees have begun bringing them to work. There also is little doubt that employee mobile phones are passively listening all the time. Does Alexa or Siri in the workplace leave a company vulnerable to corporate espionage, cyberattacks, or even extortion? Could hackers design malware to surreptitiously engage and intercept a digital assistant and listen in on a corporate executive’s life?
While that may seem far-fetched, the point is that there is limited information available to make an assessment of the associated risks.
Companies that offer digital assistant services, including Apple, Google, and Amazon, are among the largest in the world. And they invest heavily in advanced technologies designed to improve the customer experience. But at what cost?
According to privacy advocate Consumer Watchdog, patent applications describe an algorithm that would allow future versions of Amazon’s Alexa to monitor conversations and target the speaker with advertising based on what was said. That raises significant ethical questions.
Earlier this year, Amazon provided a glimpse into its practices in response to questions from a member of the U.S. Senate. Amazon confirmed that its Alexa-enabled devices store user recordings indefinitely until customers choose to delete them. Amazon also explained it uses transcripts and recordings of customer conversations with Alexa to help improve the service’s voice-recognition capabilities. And it shares records of Alexa’s interactions with third-party service providers that may be contacted through Alexa, such as Uber or Domino’s Pizza.
Amazon did confirm that Alexa stops the stream of information it is collecting “immediately once the user ends the conversation or if Alexa detects silence or speech that isn’t intended for Alexa.”
“We use the customer data we collect to provide Alexa service and improve the customer experience, and our customers know that their personal information is safe with us,” according to the letter from Amazon’s vice president of public policy.
But in light of the number of breaches of other high-profile companies, such as Yahoo, Equifax, and Capital One, mere promises of safety are not reassuring.
It would be naïve to believe that the makers of digital assistants are the only service providers collecting and leveraging customer data. An incident from 2012 provides an example of just how powerful such information can be.
A popular American retailer was widely criticized after a New York Times article exposed its practice of collecting and analyzing customer purchase histories to assign “pregnancy prediction” scores. The company’s research indicated expectant mothers were more likely to become loyal customers if they were hooked early in their pregnancies.
The article related just how accurate the pregnancy prediction scores proved to be. One father confronted a store manager demanding to know why his 16-year-old daughter was receiving coupons related to pregnancy products — only to learn later that the teen was, indeed, pregnant.
At that time, Siri was the only digital assistant on the market. Since then, Alexa (Amazon), Alice (Yandex), AliGenie (Alibaba), Bixby (Samsung), Clova (Naver), Cortana (Microsoft), Google Now, Google Assistant, and Mycroft (open source) have joined the always-listening virtual assistant market.
Data-driven technology is a permanent feature of the modern economy. All of us must remain informed and vigilant about how these new tools and technologies will impact our privacy, data protection, and organizational risk.