Washington, Feb 16 (IANS) Alarmed by a surge in impersonation fraud, including deepfakes, around the globe, the US Federal Trade Commission (FTC) has proposed modifying a rule so that it would also prohibit the impersonation of individuals.
The Lina Khan-led agency said in a statement that it is taking this action in light of public outcry about the harms caused to consumers and to impersonated individuals.
The proposed changes would extend the protections of the new rule on government and business impersonation, which is being finalised by the Commission.
“Fraudsters are using voice cloning and other AI tools to impersonate individuals with eerie precision and at scale,” Khan posted on X early on Friday.
“FTC proposes to expand its impersonation rule to cover impersonation of individuals, so these fraudsters would pay hefty penalties,” she added.
The Commission said it is also seeking comment on whether the revised rule should declare it unlawful for a firm, such as an AI platform that creates images, video, or text, “to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation”.
“Our proposed expansions to the final impersonation rule would strengthen the FTC’s toolkit to address AI-enabled scams impersonating individuals,” Khan noted.
As scammers find new ways to defraud consumers, including through AI-generated deepfakes, the proposal will help the agency deter fraud and secure redress for harmed consumers, the FTC said in its statement.