Case Law on Prosecution of Digital Impersonation and Identity Theft via AI
Case 1: People v. Golb (New York, U.S.)
Facts:
The defendant impersonated various scholars online through pseudonymous blogs and email accounts, posting messages that purported to come from those scholars in order to criticise and ridicule them. The impersonation involved claiming to be someone else, sending emails in that person’s name, and posting under that person’s identity. The content was deceptive and intended to damage the reputation of the impersonated scholars and to benefit the defendant.
Legal Issues:
Charged under New York Penal Law for identity theft, impersonation, unauthorized computer use, and aggravated harassment. Key statute: criminal impersonation in the second degree (which requires impersonating another and doing an act in the assumed character with intent to injure or defraud). The case tested how impersonation via digital media qualifies as “impersonation” under statute.
Outcome:
The New York Court of Appeals upheld part of the conviction, finding that the defendant had impersonated real people and acted with intent to defraud or injure them. The court rejected the defence of parody or satire, holding that the impersonation was meant to mislead readers into believing the messages came from the claimed authors.
Significance:
This case is a landmark for digital impersonation: it shows that pretending to be someone else online (via email or blogs) with intent to harm or to benefit oneself can be prosecuted under impersonation laws, even absent financial identity theft. It laid the groundwork for treating online identity misuse as a serious crime.
Case 2: Flores‑Figueroa v. United States (U.S. Supreme Court, 2009)
Facts:
The defendant, an undocumented worker, presented his employer with counterfeit Social Security and alien-registration cards. The numbers on the cards turned out to belong to real people, and he was charged with, among other offences, aggravated identity theft.
Legal Issues:
Under 18 U.S.C. § 1028A(a)(1), the “aggravated identity theft” statute requires that the person “knowingly transfers, possesses, or uses, without lawful authority, a means of identification of another person” in relation to a predicate offence. The legal question: does the defendant need to know that the identification number used belonged to an actual, real, person?
Outcome:
The Supreme Court held that yes — the Government must prove the defendant knew that the means of identification belonged to another actual person, not merely a fabricated identity. Because that knowledge had not been proved, the Court reversed the judgment sustaining the aggravated identity theft conviction.
Significance:
Though not AI‑specific, this case establishes the crucial element of “knowing use” of another’s identity. In digital impersonation prosecutions (including AI‑enabled ones), proving that the actor knew they were impersonating a real person is vital. The decision underpins many identity‑theft prosecutions in the United States and is influential elsewhere.
Case 3: Midler v. Ford Motor Co. (U.S. 9th Cir., 1988)
Facts:
The singer Bette Midler sued Ford after it ran a commercial featuring a sound‑alike singer who imitated Midler’s voice so closely that audiences believed they were hearing her.
Legal Issues:
Though primarily a civil case, it asked whether a unique voice, as a distinctive personal attribute, is protected and whether impersonation of that voice without authorisation is actionable. The court treated the voice as part of Midler’s identity rights.
Outcome:
The 9th Circuit held that deliberately imitating a widely known singer’s distinctive voice to sell a product, without consent, appropriates part of her identity and is actionable under California law, and it allowed Midler’s misappropriation claim to proceed.
Significance:
Although neither a criminal prosecution nor AI‑driven, the case is highly relevant to digital identity and voice/likeness impersonation. It foreshadows how courts may treat AI‑cloned voices or deepfakes as misappropriating identity attributes, and it is often cited in AI voice‑cloning debates.
Case 4: Indian Case – AI Voice Cloning and Personality Rights: Arijit Singh v. Codible Ventures LLP (Bombay High Court, 2024)
Facts:
Famous Indian singer Arijit Singh alleged that multiple defendants used AI technology to replicate his voice without his consent and commercially exploited the cloned voice in songs, advertisements, and virtual events. The defendants included AI developers, event organisers, and platform owners.
Legal Issues:
No specific statute for AI voice cloning existed; the court evaluated the claims under personality rights, the right of publicity, and unfair commercial use of voice/likeness, and considered injunctive relief. The challenge: how to treat AI‑generated voice impersonation as a violation of identity and personal rights.
Outcome:
The Bombay High Court granted an interim injunction: it restrained the defendants from using his AI‑cloned voice, ordered removal of existing content, and held that commercial use of his voice without consent breaches his personality rights. The court recognised voice as a distinctive personal attribute deserving legal protection.
Significance:
One of the first Indian High Court decisions addressing AI‑based impersonation of a voice. While not a criminal prosecution (the relief was civil and injunctive), it provides precedent for future criminal or regulatory action and shows that identity theft and impersonation are increasingly being recognised in the AI era.
Case 5: Indian IT Act Prosecution – Digital Impersonation & Personation: Example under Information Technology Act, 2000 (India)
Facts:
Section 66C of the IT Act punishes fraudulent or dishonest use of another person’s electronic signature, password, or other unique identification feature; Section 66D penalises cheating by personation by means of a computer resource or communication device. Digital impersonation via fake accounts, emails, and social media is prosecuted under these provisions, and courts have upheld convictions under them.
Legal Issues:
Key issues: whether the impersonator used the electronic identity (password, email) of another person, or pretended to be another person in digital transactions; whether there was intent to cheat or defraud or cause harm. Also, whether the victim’s identity was used without consent.
Outcome:
In multiple judgments, courts have held that even temporary unauthorised use of another’s identity or digital credentials, if it causes deception, attracts punishment under the IT Act: up to three years’ imprisonment and/or a fine. Offences under these sections are cognizable, allowing the police to investigate.
Significance:
This legal regime gives India a statutory basis for prosecuting digital impersonation and identity theft. In AI contexts (deepfake voice/image, cloned accounts), these sections can be invoked even though they were drafted before the AI era, showing how existing laws adapt to new impersonation tactics.
Case 6: Civil/Criminal Hybrid – Deepfake Videos of a Public Person (Delhi High Court Example)
Facts:
A public figure (e.g., an entrepreneur) found AI‑generated videos falsely showing him endorsing investment schemes; the videos used his image and voice (or a cloned voice) and were widely circulated on WhatsApp and Instagram. Although no criminal prosecution has been publicised as yet, the court found that the impersonation and false representation caused reputational harm and risked financial victimisation of viewers.
Legal Issues:
Impersonation of a person’s identity via AI‑manipulated media (voice, face, image) and distributing that impersonation in a way that deceives viewers into thinking the person endorses something. The primary relief sought: injunction, takedown, and damages. The case signals potential criminal avenues (fraud, identity theft) in the background.
Outcome:
The Court granted interim relief, restrained the parties and platforms from further distribution, and directed takedown of the content. The implication: deepfake impersonation is actionable.
Significance:
Though not yet a full criminal prosecution (at least none publicly reported with detailed case law), this case demonstrates that courts are already recognising AI‑generated impersonation (voice plus image) as a meaningful legal injury. It will likely inform future criminal identity‑theft and impersonation prosecutions.
Analysis & Key Observations
Evolution of Impersonation
Traditional impersonation involved theft of identifiers such as Social Security numbers, bank accounts, and credentials, as exemplified by Flores‑Figueroa (Case 2).
Digital impersonation expands this: using email/social media accounts, passwords, digital likenesses (voice, face) and AI‐generated clones.
Courts are adapting: Midler (Case 3) shows voice impersonation rights; Arijit Singh (Case 4) shows AI voice cloning recognized; IT Act cases (Case 5) show statutory basis for digital impersonation.
AI and Deepfake Impersonation
Cases involving AI voice/face cloning, deepfakes, and identity impersonation are emerging. While fully litigated criminal cases remain few, civil precedents show courts are prepared to treat these impersonations as serious.
Future prosecutions will likely combine identity theft/impersonation statutes with fraud, e‑commerce deception, defamation and personality‑rights laws.
Proof of Knowledge/Intent
Key element: perpetrator must know they are impersonating another real person and intend to deceive/harm (see Flores‑Figueroa).
Digital impersonation cases must show use of someone’s identity or likeness without lawful authority, and resulting benefit/harm.
Statutory Frameworks
Many regions use identity theft statutes (U.S.), impersonation laws (state penal codes), and cyber statutes (India’s IT Act).
AI‑specific statutes are limited; for example, Tennessee’s ELVIS Act addresses AI voice/likeness impersonation. But prosecution under existing laws remains possible.
Evidence & Forensics
Digital evidence is central: metadata of cloned voice/image, logs of account usage, registration of clone domains/accounts, transaction records if impersonation leads to fraud.
Voice/face clone technologies may leave traces of AI generation; forensic audio/visual analysis can help.
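The evidence-preservation idea above can be sketched in code. The following is an illustrative Python sketch, not a statement of forensic practice: it computes a cryptographic hash of a seized media file and records it in a simple chain-of-custody entry so that later tampering would be detectable. The file name, record fields, and examiner label are hypothetical examples.

```python
# Illustrative sketch: fixing the integrity of a piece of digital evidence
# (e.g., a suspected AI-cloned audio clip) with a SHA-256 hash and a
# timestamped chain-of-custody record. All names here are hypothetical.
import hashlib
import json
from datetime import datetime, timezone

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def custody_record(path: str, examiner: str) -> str:
    """Create a JSON chain-of-custody entry for the seized file."""
    return json.dumps({
        "file": path,
        "sha256": sha256_of_file(path),
        "acquired_at": datetime.now(timezone.utc).isoformat(),
        "examiner": examiner,
    })

# Example: write a placeholder "cloned audio" file and record its hash.
with open("suspect_clip.wav", "wb") as f:
    f.write(b"RIFF....fake-audio-bytes")
print(custody_record("suspect_clip.wav", "Examiner A"))
```

Recomputing the hash at trial and comparing it to the recorded value shows the file is byte-for-byte unchanged since acquisition; any alteration of the media would produce a different digest.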
Remedies & Penalties
Criminal prosecutions may carry imprisonment and fines (identity theft statutes, impersonation laws).
Civil remedies include injunctions, damages for personality rights, defamation, unfair commercial use of identity.
Platforms may be directed to takedown/cloned media.
Challenges Ahead
Attribution: tracing AI‑generated impersonation to specific individuals.
Jurisdiction: online impersonation often crosses borders.
Rapid tech advance: voice/face clones get easier; laws may lag.
Distinguishing between permissible impersonation (parody, satire) and unlawful identity theft/impersonation.
Conclusion
Digital impersonation and identity theft via AI or related technology is a rapidly evolving area. The cases above illustrate how courts are adapting existing laws (identity theft, impersonation, personality rights, cyber‑fraud statutes) to address new forms of impersonation: voice/face cloning, social media account takeover, AI‐generated deepfakes.
Though purely AI‐specific criminal case law is still limited, the foundation is being laid both in criminal and civil domains. Effective prosecution strategies will rely on proving knowledge/intent, linking to identity‐theft or deception frameworks, using strong digital forensics, and leveraging emerging personality‐rights jurisprudence.
