Model ID by IMF · Guide

Your Image in the Age of AI

What Models Need to Know

Last updated: February 2026

AI is changing the modeling industry — from digital twins to deepfakes to your photos being scraped for AI training. This guide covers what legal protections exist right now, where the gaps are, and what you can do to protect yourself. Everything here is sourced from enacted legislation, official government guidance, and verified reporting.


The short version:

Laws are catching up — New York, California, Tennessee, and the EU all now have laws addressing digital replicas and deepfakes. The strongest is the NY Fashion Workers Act, which requires separate written consent, specific terms, and compensation for any digital replica use. Federally, the TAKE IT DOWN Act criminalizes non-consensual intimate deepfakes. But major gaps remain: no U.S. federal right of publicity, AI training data is largely unregulated, and enforcement is limited. Your best protection right now is knowing your rights, reading contracts carefully, monitoring your digital presence, and documenting everything.

Digital Twins / Digital Replicas

A digital twin is an AI-generated version of you, created from your photographs and body scans. Once created, it can be posed, dressed, and placed in any setting — without you being present. H&M created digital twins of 30 models in 2025; the models retained ownership and were compensated per use.

AI-Generated Models (Fully Synthetic)

Entirely fictional people created by AI — like Shudu and Noonoouri, who have appeared in campaigns for Balmain, Dior, and Versace. These aren't based on a specific person, but when AI tools are trained on real people's images, the output can closely resemble real individuals.

Deepfakes

AI-manipulated media that makes you appear to say or do things you never did. An estimated 500,000 deepfakes were shared in 2023; that figure reached approximately 8 million in 2025. Resemble AI tracked 2,031 verified deepfake incidents in Q3 2025 alone. Models are particularly vulnerable because high-quality images of them are widely available online.

AI Training Data Scraping

AI image generators are trained on massive datasets of images scraped from the internet. The LAION-5B dataset — used to train Stable Diffusion and Google's Imagen — contains 5.85 billion images collected without anyone's consent. Your portfolio photos, social media images, and editorial work may already be in these datasets.

AI Body & Feature Alteration

AI tools can change your skin tone, body shape, age, or facial features in post-production — without a new photoshoot and potentially without your knowledge or consent.

Key distinction:

Authorized digital twins involve informed consent, specific terms, compensation, and model control. Unauthorized use includes scraping for AI training without consent, creating deepfakes, reusing altered images beyond the original contract, and creating replicas without written agreement.

New York Fashion Workers Act

In effect since June 19, 2025
  • Requires separate written consent for digital replicas — cannot be bundled into a representation agreement
  • Consent must detail scope, purpose, pay rate, and duration
  • Each new use requires new consent — even if the agency or client already has the photos
  • Previous power of attorney agreements covering digital replicas were automatically invalidated
  • Penalties: $3,000 for a first violation, $5,000 for each subsequent violation; private right of action; up to 300% damages for willful violations
  • Enforced by NY Department of Labor

California AB 2602 & AB 1836

In effect since Jan 1, 2025
  • AB 2602: Digital replica contract provisions are unenforceable if they allow a replica to replace in-person work, lack a reasonably specific description of intended uses, and you weren't represented by counsel or a union
  • AB 1836: Posthumous protection — estate consent required, 70 years, $10,000 minimum damages
  • Applies to performances fixed on or after Jan 1, 2025

TAKE IT DOWN Act

Federal, signed May 19, 2025
  • Criminalizes non-consensual intimate imagery including AI deepfakes
  • Up to 2 years imprisonment (3 years for offenses involving minors)
  • Platforms must remove within 48 hours
  • Platform compliance deadline: May 19, 2026
  • Enforced by FTC

NY AI Transparency Laws

Signed Dec 11, 2025
  • Mandatory disclosure of AI-generated performers in ads (effective June 9, 2026)
  • Posthumous right of publicity — estate consent required, 40-year limitation

Tennessee ELVIS Act

In effect since July 1, 2024
  • First state law covering AI voice replicas
  • Extends liability to providers of tools used to create unauthorized replicas

EU AI Act Article 50

Takes effect Aug 2, 2026
  • AI-generated content must carry machine-readable markings
  • Deepfakes must be disclosed at first exposure
  • Code of Practice being finalized (final version expected June 2026)

Denmark Copyright Amendment

Expected March 31, 2026
  • Would grant copyright-like protection over face, voice, body
  • First EU law treating identity as intellectual property
  • 50-year posthumous protection
  • The Netherlands has proposed similar legislation

State Right of Publicity Laws

Varies by state
  • No federal right of publicity — varies by state
  • At least 4 states (Arkansas, Montana, Pennsylvania, Utah) enacted new digital replica laws in 2025
  • 45 states have sexually explicit deepfake laws
  • 47 states have some AI synthetic media law

NO FAKES Act

Pending — introduced April 2025
  • Would create federal digital replication right for all individuals
  • DMCA-style notice-and-takedown for unauthorized replicas
  • 70-year posthumous protection
  • Bipartisan support, backed by SAG-AFTRA and 400+ artists
  • Introduced in Senate (S.1367) and House (H.R. 2794), in committee
  • Criticism from EFF and CDT over free expression concerns

No federal right of publicity

Protection depends on which state you’re in. What’s illegal in NY may be unregulated elsewhere.

AI training data is largely unregulated in the U.S.

No law requires consent before your public images are used for AI training. Lawsuits are pending, but there is no definitive ruling yet.

Geographic limitations

The FWA covers only New York; California's laws cover only California. International work creates cross-border enforcement gaps.

Detection is difficult

AI images are increasingly indistinguishable from real photos. Detection tools exist but lag behind the latest generators.

Independent contractor status

Because most models are classified as independent contractors, they can't unionize under current U.S. labor law — unlike SAG-AFTRA actors, who negotiated AI protections through collective bargaining.

Enforcement capacity

Laws exist but enforcement agencies can’t match the scale of AI content creation. Pursuing claims requires time, money, and legal knowledge.

Before You Sign Anything

1. Read digital replica clauses carefully — any mention of "digital replica," "AI," "digital twin," "virtual likeness," or "synthetic content" requires attention

2. Insist on specificity — consent should detail exact uses, platforms, duration, and compensation rate

3. Get legal representation — under CA AB 2602, provisions may be unenforceable without counsel

4. Check body scan terms — understand exactly how scan/photo data will be used

5. Include revocation clauses — negotiate the ability to withdraw consent

Monitor Your Digital Presence

1. Reverse image search regularly (Google Images, TinEye)

2. Check if your images are in AI training data — Have I Been Trained searches the LAION-5B dataset and lets you opt out

3. Opt out of AI training on platforms (Meta EU opt-out, LinkedIn, Microsoft Office settings)

4. Use AI detection tools (Sightengine, Illuminarty, Google SynthID) — imperfect but useful indicators

If You Discover Unauthorized AI Use

1. Document everything — screenshots with timestamps, URLs, metadata

2. Report to the platform — under the TAKE IT DOWN Act, platforms must remove non-consensual content within 48 hours

3. File a complaint — NY DOL for FWA violations, FTC for intimate deepfakes, your national DPA in the EU for GDPR

4. Consult a lawyer — possible claims under right of publicity, FWA, Lanham Act, GDPR

5. Contact industry organizations — Model Alliance, Transparency Coalition, Spawning

Quick Reference

  • Fashion Workers Act (New York State; in effect since June 19, 2025): Written consent required for digital replicas; scope, pay, duration specified
  • CA AB 2602 (California; in effect since Jan 1, 2025): Digital replica contract terms unenforceable without specific consent and representation
  • CA AB 1836 (California; in effect since Jan 1, 2025): Posthumous digital replica protection; estate consent; up to 70 years
  • TAKE IT DOWN Act (Federal, U.S.; signed May 19, 2025): Criminalizes non-consensual intimate deepfakes; 48-hour platform takedown
  • TN ELVIS Act (Tennessee; in effect since July 1, 2024): Right of publicity covers AI voice and likeness replicas
  • NY AI Transparency (New York State; effective June 9, 2026): Mandatory disclosure of AI-generated performers in ads
  • EU AI Act Art. 50 (European Union; takes effect Aug 2, 2026): AI content must be labeled; deepfakes disclosed
  • Denmark Copyright (Denmark; expected March 31, 2026): Copyright-like protection over face, voice, body
  • NO FAKES Act (Federal, U.S.; pending, in committee): Federal digital replica right; notice-and-takedown

1. Identity Verification — A verified digital identity confirms who you are: a prerequisite for asserting rights over your likeness and filing complaints.

2. Digital Twin Authorization — Certified models can register proof of ownership over their digital twin. A public verification tool shows brands where they can book the digital twin safely, with contact details managed by the model or their agency; booking terms are handled directly between the brand and the model or agency.

3. Reporting System — Report unauthorized use, from contract violations to deepfakes, creating documented records that support administrative complaints and legal claims.

4. Digital Ownership Clause — IMF's position: any digital twin or AI-generated representation replicating a real model's likeness is the sole property of that individual, including exclusive rights to consent, control, monetize, revoke, or restrict use.

These tools don't replace legal advice or legal action. They create the documentation, verification, and paper trail that make it easier to exercise your rights when you need to.

Sources

NY Department of Labor — Fashion Workers Act guidance and FAQs

NY Senate Bill S9832 full text

California AB 2602 and AB 1836 full text

TAKE IT DOWN Act, S. 146

NO FAKES Act, S.1367 / H.R. 2794

European Commission — AI Act Article 50

EU AI Office — Code of Practice on Transparency (Dec 17, 2025)

European Parliament Think Tank — Danish copyright analysis

Debevoise & Plimpton — NY AI transparency laws analysis

Morgan Lewis — Fashion Workers Act analysis

Benesch — FWA digital replica requirements

Davis+Gilbert — Understanding the FWA

ArentFox Schiff — FWA labor protections

Manatt — California AI and digital replica laws

Skadden — California AI bills analysis

Cornell ILR School — AI, Fashion, and Creative Labor

Model Alliance — FWA resources

Business of Fashion / CNN / Inc. — H&M digital twins reporting

Spawning — Have I Been Trained documentation

Resemble AI — Q3 2025 Deepfake Incident Report

DeepStrike — Deepfake statistics 2025

Congressional Research Service — AI and Right of Publicity (LSB11052)


This guide is for educational purposes only and does not constitute legal advice.

International Modeling Foundation · model-id.com