Lina Khan-led US FTC aims to modify rule to curb deepfakes


by IANS

Washington, Feb 16 (IANS) Alarmed by a global surge in impersonation fraud, including deepfakes, the US Federal Trade Commission (FTC) has sought to modify a rule that would prohibit the impersonation of individuals.


The Lina Khan-led agency said in a statement that it is taking this action in light of public outcry about the harms caused to consumers and to impersonated individuals.


The proposed changes would extend the protections of the new rule on government and business impersonation that is being finalised by the Commission.


“Fraudsters are using voice cloning and other AI tools to impersonate individuals with eerie precision and at scale,” Khan posted on X early on Friday.


“FTC proposes to expand its impersonation rule to cover impersonation of individuals, so these fraudsters would pay hefty penalties,” she added.


The Commission said it is also seeking comment on whether the revised rule should declare it unlawful for a firm, such as an AI platform that creates images, video, or text, “to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation”.


“Our proposed expansions to the final impersonation rule would strengthen the FTC’s toolkit to address AI-enabled scams impersonating individuals,” Khan noted.


As scammers find new ways to defraud consumers, including through AI-generated deepfakes, this proposal will help the agency deter fraud and secure redress for harmed consumers, the FTC said in its statement.
