Copyright Implications Of AI-Generated Mental Health Narratives
I. Core Copyright Issues in AI-Generated Mental Health Narratives
Before examining the case law, it helps to identify the major legal issues:
Who is the author? (Human, AI developer, or user?)
Does the work meet the originality requirement?
Is the work copyrightable at all?
What if the AI output resembles existing personal memoirs or therapy narratives?
Could training data use infringe copyright?
Are there privacy and ethical overlaps (especially with mental health disclosures)?
II. Requirement of Human Authorship
1. Burrow-Giles Lithographic Co. v. Sarony
Background
This foundational U.S. Supreme Court case involved a photograph of Oscar Wilde. The defendant argued that photographs were mechanical reproductions and not eligible for copyright.
Legal Holding
The Court ruled that copyright protects works that are:
Original
Created by a human author
Reflecting intellectual conception
Importance for AI Mental Health Narratives
The Court emphasized that copyright protects “the fruits of intellectual labor” of a human.
If an AI independently generates a therapeutic narrative without meaningful human creative input, under this precedent:
The output likely lacks a human author.
It may not qualify for copyright protection.
This case forms the backbone of the “human authorship doctrine.”
2. Feist Publications, Inc. v. Rural Telephone Service Co.
Background
Rural Telephone published a white pages directory. Feist copied its listings without permission, and Rural claimed copyright infringement.
Legal Holding
The Supreme Court ruled:
Facts are not copyrightable.
A work must contain minimal creativity.
“Originality” requires independent creation plus creativity.
Relevance to AI Mental Health Narratives
AI-generated therapy narratives may:
Reproduce common emotional expressions.
Use standard narrative structures.
If a mental health story consists only of generic statements such as:
“I struggled, I felt lost, but I healed.”
the content may fail the originality threshold.
However, if a human substantially edits and creatively structures the narrative, the originality requirement may be satisfied.
III. Explicit Rejection of Non-Human Authorship
3. Naruto v. Slater
Background
A monkey (Naruto) took selfies using a photographer’s camera. Animal rights groups argued the monkey owned the copyright.
Legal Holding
The Ninth Circuit held:
Animals cannot own copyright.
Copyright law presumes human authorship.
Importance for AI
Although the case involved an animal rather than an AI, courts and scholars rely on it to reinforce that:
Non-humans cannot be authors.
Authorship presupposes legal personhood.
If a generative AI writes a therapy narrative autonomously:
The AI cannot be the author.
No copyright exists unless a human exercised creative control.
This case strongly influences how AI-generated expressive works are treated.
IV. Direct AI Copyright Cases
4. Thaler v. Perlmutter
Background
Dr. Stephen Thaler attempted to register a work created by his AI system “Creativity Machine,” listing the AI as the sole author.
The U.S. Copyright Office refused registration.
Legal Holding
The federal court upheld the refusal, holding that copyright law requires human authorship. The court clarified:
Machines cannot be authors.
Works generated without human creative input are not copyrightable.
Impact on Mental Health Narratives
If:
A user types a short prompt:
“Write a 2,000-word trauma recovery story.”
AI generates the entire text.
Then under this case:
The output is likely uncopyrightable.
No exclusive ownership exists.
However, if a therapist or writer:
Substantially edits,
Rearranges,
Rewrites,
Injects personal lived experience,
Then the human-authored portions may be protected.
V. Copyright Office Administrative Decisions
5. Zarya of the Dawn Copyright Registration Decision
Background
The registration concerned Kris Kashtanova’s graphic novel Zarya of the Dawn, in which:
Text was written by a human.
Images were generated using Midjourney AI.
Decision
The U.S. Copyright Office held:
The human-written text was protected.
The AI-generated images were not protected.
Arrangement/selection by the human was protected.
Significance
Applied to mental health narratives:
If:
A person writes their therapy reflections,
Uses AI to expand paragraphs,
Curates, edits, and arranges the material,
Then:
The human-authored components are protected.
Pure AI-generated portions are not.
This creates a hybrid copyright model.
VI. Substantial Similarity & Derivative Works
AI systems trained on memoirs, blogs, and therapy literature raise infringement risks.
6. Anderson v. Stallone
Background
A writer created a script for a Rocky sequel using the Rocky characters without permission.
Holding
The court ruled:
Unauthorized derivative works infringe.
Even creative additions do not excuse copying protected characters.
Relevance
If AI generates a mental health memoir:
Closely resembling a well-known published autobiography,
Reproducing structure, unique phrasing, or narrative arc,
There may be derivative work infringement.
7. Nichols v. Universal Pictures Corp.
Background
This case addressed whether copying character types and plot structures constitutes infringement.
Legal Principle
Judge Learned Hand established:
Copyright protects expression, not ideas.
General themes (e.g., “family conflict,” “redemption”) are not protected.
Application to AI Therapy Narratives
Common elements such as:
“Healing journey”
“Overcoming depression”
“Childhood trauma reflection”
Are unprotectable ideas.
But highly specific narrative expression is protected.
Thus, AI outputs risk infringement only if they replicate specific expressive details, not general therapeutic themes.
VII. Fair Use and AI Training
8. Authors Guild v. Google, Inc.
Background
Google scanned millions of books to create a searchable index.
Holding
The court ruled the scanning was fair use because:
It was transformative.
It did not replace the market for books.
Implications for AI
AI developers argue:
Training on books (including mental health memoirs) is transformative.
The output does not reproduce full works.
However:
If AI generates text too similar to a copyrighted memoir,
Fair use defenses weaken.
VIII. UK Approach to Computer-Generated Works
9. Nova Productions Ltd v Mazooma Games Ltd
Background
The case concerned authorship of images generated during play of a computer video game.
Holding
The court ruled:
The author is the person who made the “arrangements necessary” for creation.
Under UK law (CDPA 1988 Section 9(3)):
Computer-generated works can have authors.
The “arranger” may own copyright.
Implications
In the UK:
A therapist prompting AI extensively
Structuring detailed narrative flows
Directing stylistic output
May qualify as the author.
This differs from the stricter U.S. approach.
IX. Moral Rights and Sensitive Mental Health Content
In jurisdictions recognizing strong moral rights (e.g., UK, EU):
If AI alters:
Personal trauma accounts
Therapeutic disclosures
Issues of:
Attribution
Integrity of the work
Misrepresentation
May arise.
Even if copyright is unclear, ethical and reputational harms are significant.
X. Key Legal Conclusions
1. Pure AI-Generated Mental Health Narratives (U.S.)
Likely not copyrightable.
No human author = no protection.
2. Human-AI Collaborative Narratives
Human contributions are protected.
AI portions are not.
Arrangement and editing may be protected.
3. Risk of Infringement
High if output closely resembles specific memoirs.
Low if it uses generic therapeutic tropes.
4. UK Position
More flexible.
“Person making arrangements” may qualify as author.
XI. Broader Policy Concerns
Exploitation of personal trauma data in training sets
Ownership disputes between therapists and patients
Confidentiality risks
Commercialization of automated therapy storytelling
Absence of clear statutory reform
XII. Emerging Legal Trend
Courts consistently emphasize:
Human creativity remains central to copyright protection.
AI may be:
A tool (like a camera or word processor)
But not an author
Until legislatures amend copyright statutes, the dominant legal position remains human-centered.