CTA Welcomes FTC Efforts on Impersonation Fraud but Highlights Concerns Going Forward

March is Fraud Prevention Month. Nearly a year ago, the Consumer Technology Association (CTA) built a coalition of over 200 trade associations and professional organizations in the business events industry that shared a desire for Federal Trade Commission (FTC) action to combat impersonation fraud. On February 15, 2024, our calls for action came to fruition when the FTC issued a final rule on impersonation fraud, giving the agency additional tools to go after fraudsters in federal court.

As CTA has highlighted for years, impersonation fraud is a serious challenge for thousands of businesses. It’s also a challenge for us personally. CTA has been a victim of impersonation fraud on numerous occasions, creating reputational risks both for our organization and for the CTA members whose brands are exploited. Bad actors have sought to fraudulently pose as CTA representatives and vendors of our show CES, the most powerful tech event in the world. These scams have taken a variety of forms, the most common being emails purporting to sell CES attendee lists, which CTA does not sell or otherwise make publicly available. Fraudsters have also offered false discounted badges, created fraudulent websites offering hotel bookings for CES, and even attempted to fraudulently ‘sell’ CES exhibit space, victimizing startups seeking to showcase their products for a global audience.

While the news of the FTC’s final rule was welcome relief, the FTC subsequently issued a supplemental notice that raises concerns about new liability for our industry. In our comments on the original proceeding, CTA urged the FTC to limit the bounds of proposed “means and instrumentalities” liability to entities that “have knowledge or consciously avoid knowing that they are making representations being used to commit impersonation fraud.”

However, the FTC’s newly issued supplemental notice covers not only providing “representations” that could be misused, but also makes it unlawful “to provide goods or services with knowledge or reason to know that those goods or services will be used” in impersonations. Read broadly, CTA believes this could be construed to try to impose liability on any tech product or system that is misused for impersonation, if there is reason to know it could be used for fraud.

If this story sounds familiar, it should. CTA’s CEO Gary Shapiro has frequently pointed to the Supreme Court’s decision forty years ago in Sony Corp. of America v. Universal City Studios, in which the Court rejected Hollywood’s efforts to have VCRs banned as illegal products over fears they could be misused. In that landmark case, the Court sided with innovation, ensuring that consumer electronics manufacturers could go forward without the fear of liability. Whether the technology is a VCR, the internet or a new AI platform, CTA’s long-standing principle has been to regulate the conduct, not the tool. Hammers can be used to build houses and can also be used to break into cars. So, we criminalize car theft, not hammers.

Given the FTC’s own statement, which includes reference to seeking comment on “whether the revised rule should declare it unlawful for a firm, such as an AI platform that creates images, video, or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation,” this concern about extremely broad application is not farfetched.

Make no mistake, the FTC’s final rule on impersonation fraud has the potential to provide much-needed relief for businesses and protection for American consumers, who lost more than $10 billion to fraud in 2023. Of that sum, more than $750 million was due to business imposters. But making the tech industry the FTC’s scapegoat is not the answer. Our industry stands ready to partner with the FTC in its efforts to combat fraudulent activity, and we look forward to providing detailed comments in response to the most recent inquiry by the April 30, 2024 deadline.
