AI is changing the modeling industry — from digital twins to deepfakes to your photos being scraped for AI training. This guide covers what legal protections exist right now, where the gaps are, and what you can do to protect yourself. Everything here is sourced from enacted legislation, official government guidance, and verified reporting.
Last updated: February 2026 · For educational purposes only, not legal advice
Laws are catching up — New York, California, Tennessee, and the EU all now have laws addressing digital replicas and deepfakes. The strongest is the NY Fashion Workers Act, which requires separate written consent, specific terms, and compensation for any digital replica use. Federally, the TAKE IT DOWN Act criminalizes non-consensual intimate deepfakes. But major gaps remain: no U.S. federal right of publicity, AI training data is largely unregulated, and enforcement is limited. Your best protection right now is knowing your rights, reading contracts carefully, monitoring your digital presence, and documenting everything.
A digital twin is an AI-generated version of you, created from your photographs and body scans. Once created, it can be posed, dressed, and placed in any setting — without you being present. H&M created digital twins of 30 models in 2025. Models retained ownership and were compensated per use.
Entirely fictional people created by AI — like Shudu and Noonoouri, who have appeared in campaigns for Balmain, Dior, and Versace. These aren't based on a specific person, but when AI tools are trained on real people's images, the output can closely resemble real individuals.
AI-manipulated media that makes you appear to say or do things you never did. An estimated 500,000 deepfakes were shared in 2023; that figure reached approximately 8 million in 2025. Resemble AI tracked 2,031 verified deepfake incidents in Q3 2025 alone. Models are particularly vulnerable because high-quality images of them are widely available online.
AI image generators are trained on massive datasets of images scraped from the internet. The LAION-5B dataset — used to train Stable Diffusion and Google's Imagen — contains 5.85 billion images collected without anyone's consent. Your portfolio photos, social media images, and editorial work may already be in these datasets.
AI tools can change your skin tone, body shape, age, or facial features in post-production — without a new photoshoot and potentially without your knowledge or consent.
Key distinction:
Authorized digital twins involve informed consent, specific terms, compensation, and model control. Unauthorized use includes scraping for AI training without consent, creating deepfakes, reusing altered images beyond the original contract, and creating replicas without written agreement.
No federal right of publicity
Protection depends on which state you’re in. What’s illegal in NY may be unregulated elsewhere.
AI training data is largely unregulated in the U.S.
No law requires consent before using your publicly available images for AI training. Lawsuits are pending, but courts have not yet issued a definitive ruling.
Geographic limitations
The Fashion Workers Act covers only New York; California's laws cover only California. International work creates cross-border enforcement gaps.
Detection is difficult
AI-generated images are increasingly indistinguishable from real photos. Detection tools exist but lag behind the latest generators.
Independent contractor status
As independent contractors, models can't collectively bargain under current U.S. labor law, unlike SAG-AFTRA actors, who negotiated AI protections into their contracts.
Enforcement capacity
Laws exist but enforcement agencies can’t match the scale of AI content creation. Pursuing claims requires time, money, and legal knowledge.
Read digital replica clauses carefully — any mention of "digital replica," "AI," "digital twin," "virtual likeness," or "synthetic content" requires attention
Insist on specificity — consent should detail exact uses, platforms, duration, and compensation rate
Get legal representation — under CA AB 2602, digital replica provisions may be unenforceable if you weren't represented by counsel or a labor union
Check body scan terms — understand exactly how scan/photo data will be used
Include revocation clauses — negotiate ability to withdraw consent
Reverse image search regularly (Google Images, TinEye)
Check if your images are in AI training data — Have I Been Trained searches the LAION-5B dataset and lets you opt out
Opt out of AI training on platforms (Meta EU opt-out, LinkedIn, Microsoft Office settings)
Use AI detection tools (Sightengine, Illuminarty, Google SynthID) — imperfect but useful indicators
Document everything — screenshots with timestamps, URLs, metadata
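For the "document everything" step, even a simple script helps. The sketch below is a hypothetical helper, not an official tool: it hashes a saved screenshot and appends a timestamped record to a local log, so you can later show the file hasn't been altered since capture. The function and file names are illustrative.

```python
# Minimal evidence-logging sketch (hypothetical helper, not an official tool).
# Records a SHA-256 hash, a UTC timestamp, and the source URL for a screenshot.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_evidence(screenshot_path, source_url, log_path="evidence_log.jsonl", note=""):
    """Append one evidence record to a JSON Lines log and return the record."""
    data = Path(screenshot_path).read_bytes()
    record = {
        "file": str(screenshot_path),
        # Re-hashing the file later and comparing proves it is unaltered.
        "sha256": hashlib.sha256(data).hexdigest(),
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
        "source_url": source_url,
        "note": note,
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")
    return record
```

If you ever need to rely on a screenshot in a complaint, re-hash the original file and compare it against the logged value: a match shows the file is the same one you captured at the logged time.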
Report to the platform — under the TAKE IT DOWN Act, platforms must remove non-consensual intimate content within 48 hours of a valid removal request
File a complaint — NY DOL for FWA violations, FTC for intimate deepfakes, national DPA in EU for GDPR
Consult a lawyer — possible claims under right of publicity, FWA, Lanham Act, GDPR
Contact industry organizations — Model Alliance, Transparency Coalition, Spawning
| Law | Jurisdiction | Key Protection | Status |
|---|---|---|---|
| Fashion Workers Act | New York State | Written consent required for digital replicas; scope, pay, duration specified | In effect since June 19, 2025 |
| CA AB 2602 | California | Digital replica contract terms unenforceable without specific consent and representation | In effect since Jan 1, 2025 |
| CA AB 1836 | California | Posthumous digital replica protection; estate consent; up to 70 years | In effect since Jan 1, 2025 |
| TAKE IT DOWN Act | Federal (U.S.) | Criminalizes non-consensual intimate deepfakes; 48-hour platform takedown | Signed May 19, 2025 |
| TN ELVIS Act | Tennessee | Right of publicity covers AI voice and likeness replicas | In effect since July 1, 2024 |
| NY AI Transparency | New York State | Mandatory disclosure of AI-generated performers in ads | Effective June 9, 2026 |
| EU AI Act Art. 50 | European Union | AI content must be labeled; deepfakes disclosed | Takes effect Aug 2, 2026 |
| Denmark Copyright | Denmark | Copyright-like protection over face, voice, body | Expected March 31, 2026 |
| NO FAKES Act | Federal (U.S.) | Federal digital replica right; notice-and-takedown | Pending — in committee |
A verified digital identity confirms who you are — prerequisite for asserting rights over your likeness and filing complaints.
Certified models can register proof of ownership over their digital twin. A public verification tool shows brands where they can safely book the digital twin, with contact details managed by the model or their agency. Model ID proves ownership and provides the verified point of contact; booking terms are handled directly between the brand and the model or agency.
Report unauthorized use — from contract violations to deepfakes — creating documented records that support administrative complaints and legal claims.
IMF’s position: any digital twin or AI-generated representation replicating a real model’s likeness is the sole property of that individual, including exclusive rights to consent, control, monetize, revoke, or restrict use.
These tools don't replace legal advice or legal action. They create the documentation, verification, and paper trail that makes it easier to exercise your rights when you need to.
NY Department of Labor — Fashion Workers Act guidance and FAQs
NY Senate Bill S9832 full text
California AB 2602 and AB 1836 full text
TAKE IT DOWN Act, S. 146
NO FAKES Act, S.1367 / H.R. 2794
European Commission — AI Act Article 50
EU AI Office — Code of Practice on Transparency (Dec 17, 2025)
European Parliament Think Tank — Danish copyright analysis
Debevoise & Plimpton — NY AI transparency laws analysis
Morgan Lewis — Fashion Workers Act analysis
Benesch — FWA digital replica requirements
Davis+Gilbert — Understanding the FWA
ArentFox Schiff — FWA labor protections
Manatt — California AI and digital replica laws
Skadden — California AI bills analysis
Cornell ILR School — AI, Fashion, and Creative Labor
Model Alliance — FWA resources
Business of Fashion / CNN / Inc. — H&M digital twins reporting
Spawning — Have I Been Trained documentation
Resemble AI — Q3 2025 Deepfake Incident Report
DeepStrike — Deepfake statistics 2025
Congressional Research Service — AI and Right of Publicity (LSB11052)
This guide is for educational purposes only and does not constitute legal advice.
International Modeling Foundation · model-id.com