Synthetic Media, Voice Cloning, and the New Right of Publicity Risk Map for 2026
By Jay Kotzker
Synthetic media and AI-driven voice cloning have moved from novelty to mainstream production tools. In advertising, entertainment, gaming, and social platforms, digital replicas of talent, sometimes authorized and often not, are being created at unprecedented scale. The legal landscape has responded quickly, and 2025 marked a turning point in right-of-publicity law, deepfake regulation, and content licensing.
For production companies, agencies, and talent, understanding this evolving framework is now essential to mitigating reputational, contractual, and regulatory risk.
A Fast-Shifting Statutory Landscape
- Tennessee’s ELVIS Act: The New National Reference Point
In 2024, Tennessee enacted the Ensuring Likeness, Voice, and Image Security (ELVIS) Act, the first state law to expressly extend right-of-publicity protections to AI-generated voice clones. The Act criminalizes unauthorized digital replication of a person’s voice and provides civil remedies for infringement—marking a significant expansion of traditional likeness protections.
The Act is already influencing contractual norms, particularly for performers, voice artists, and influencers whose digital likeness or vocal style may be used in derivative works or automated production pipelines.
- A Wave of State Deepfake Bills
Following the ELVIS Act, several states—including California, New York, Texas, and Illinois—have introduced or strengthened statutes targeting:
- Deepfake content used in political advertising
- Non-consensual sexual deepfakes
- Unauthorized digital replicas in entertainment or commercial settings
Many of these laws create private rights of action, enhanced penalties for intentional misuse, and platform obligations to remove harmful synthetic media.
- Growing Momentum for a Federal Framework
While no federal right of publicity exists, bipartisan proposals, most notably the NO FAKES Act, continue to circulate, seeking to regulate AI-generated replicas in national political advertising and to protect talent from unauthorized digital exploitation. Even without formal enactment, these proposals are shaping industry expectations around notice, consent, and clear contractual allocation of rights.
Key Risks for Production Companies, Brands & Talent
- Unauthorized Replication & Misattribution
Voice cloning and lifelike synthetic imagery can blur the lines between parody, homage, and infringement. Misattribution, whether intentional or accidental, can trigger:
- Right-of-publicity claims
- Lanham Act false endorsement allegations
- Defamation or moral-rights claims
- Contractual violations when exclusivity or brand alignment is at stake
- Legacy Contracts Not Designed for AI
Traditional talent, production, and licensing agreements often fail to address:
- Rights to create or prevent digital doubles
- Ownership or control of AI-generated performances
- Duration and geographic scope of synthetic media use
- Compensation structures tied to automated reproduction or downstream exploitation
This creates gaps that can expose studios, producers, and brands to legal challenge.
- Platform & Distributor Uncertainty
Streaming platforms, social networks, and advertising channels are rapidly updating their terms to restrict unauthorized synthetic media, impose disclosure requirements, and allocate takedown responsibilities. Contracting parties need to ensure downstream compliance to avoid content removal or account suspension.
Contract Implications: What Needs to Change in 2026
- Explicit Rights Regarding Synthetic Media
Modern agreements should specify whether the hiring party may:
- Create AI-generated versions of a performer’s likeness or voice
- Modify recorded performances using AI tools
- Use AI derivations beyond original contexts (e.g., dubbing, advertising, future seasons, localization)
Clarity around scope, duration, medium, and territory is critical.
- Informed Consent Requirements
Increasingly, industry standards and state statutes demand clear, informed, written consent for:
- Voice cloning
- Digital replica creation
- Use of a performer’s likeness in AI tools
- Training models on a performer’s voice or image
Consent should be specific in scope, documented in writing, and revocable where the governing statute requires.
- Compensation & Residuals for Synthetic Use
Talent negotiations are evolving to include:
- Separate fees for creation of a digital double
- Royalty or residual frameworks tied to AI-generated performances
- Revenue-sharing when synthetic media is used across multiple projects or commercial campaigns
These structures help avoid disputes about exploitation beyond the original engagement.
- Controls on Training Data & AI Model Inputs
Contracts should address whether a performer’s likeness or voice can be used to:
- Train internal production models
- Train vendor or third-party AI systems
- Generate derivative content not directly controlled by the producer
Prohibitions or strict limitations are increasingly common in high-profile talent agreements.
- Disclosure & Labeling Obligations
With states imposing disclosure requirements for synthetic media in political and commercial contexts, agreements should:
- Allocate responsibility for labeling content
- Require compliance with platform policies
- Ensure that distributors and advertisers understand when synthetic media is in use
- Takedown, Enforcement & Allocation of Liability
Given the growing patchwork of deepfake laws, agreements should include:
- Procedures for monitoring misuse
- Rights to request removal from platforms
- Indemnities for unauthorized synthetic reproductions
- Cooperation obligations for responding to claims or enforcement actions
What This Means for the Industry
The acceleration of synthetic media has created both extraordinary creative potential and meaningful legal exposure. In 2026, talent and production stakeholders are moving toward:
- AI-specific addenda to talent and licensing agreements
- Stronger performer control over digital replicas
- Detailed usage restrictions to protect brands and reputations
- Lifecycle governance, including downstream enforcement and platform compliance
The industry is entering an era in which right-of-publicity law, AI ethics, and content licensing converge. Proactive contracting is now essential to avoiding disputes and safeguarding creative relationships.
Conclusion
As states adopt new synthetic-media and voice-cloning laws—and as courts begin confronting unauthorized digital replicas—the expectations around transparency, consent, and contractual clarity are rising quickly. Production companies, agencies, talent, and brands must ensure their agreements are updated for a world where any performance can be replicated, remixed, or redistributed with a few lines of code.
Holon Law Partners advises clients across entertainment, advertising, and emerging technology on navigating these fast-developing issues with a focus on compliance, creative integrity, and long-term risk management.
