Summary:
- SAG-AFTRA condemns Tilly Norwood, a synthetic “actor” created by AI, as a threat to human performers’ rights.
- The union warns against the use of AI-generated content in entertainment, emphasizing the importance of human experience.
- SAG-AFTRA’s battle against unauthorized AI use extends beyond Hollywood, advocating for consent and compensation laws nationwide.
When the announcement of Tilly Norwood hit the entertainment news cycle, it didn’t come with a résumé, a past role, or even a birth year. It came with code. Norwood isn’t a breakout star or a Hollywood actor on the rise. She’s not even real. She’s a fully synthetic “actor” created by artificial intelligence—and SAG-AFTRA is having none of it.
In a statement issued Monday, the union made its position unmistakably clear: “Tilly Norwood is not an actor. It’s a character generated by a computer program that was trained on the work of countless professional performers — without permission or compensation.”
SAG-AFTRA went further, describing Norwood’s creation as a “problem” rather than a solution. “It has no life experience to draw from, no emotion,” the union wrote, “and from what we’ve seen, audiences aren’t interested in watching computer-generated content untethered from the human experience.”
This is not theoretical posturing. It’s a flashpoint in an increasingly urgent battle between labor and the growing influence of generative AI in entertainment. And it arrives just months after a hard-won deal in which SAG-AFTRA secured new guardrails around digital likeness, consent, and compensation—after the longest actors’ strike in Hollywood in four decades.
That deal, reached in 2023, marked the first time contract language explicitly protected performers from having their likeness replicated without permission. Under current agreements, producers must now give notice before scanning an actor’s face or body, and performers can negotiate how that data is used—and reused. A scanned stunt double can’t suddenly become a leading lady without a new contract and a new check.
What Tilly Norwood represents, at least in the union’s eyes, is an attempt to dodge those safeguards entirely. “To be clear,” the union said, “this character is not subject to collective bargaining, and we will not accept the replacement of human performers by synthetics.”
The creators behind Norwood have reportedly begun shopping her around for representation, a move that suggests the character—or at least the company behind it—is seeking access to traditional industry pathways. SAG-AFTRA’s warning shot is as much for those agencies and producers as it is for the press: if you’re under contract with us, you can’t use synthetic performers without notice, negotiation, and consent.
Outside of Hollywood, the union has also gone to battle in statehouses and on Capitol Hill. It backed the NO FAKES Act, federal legislation that would make it illegal to use a person’s voice or likeness without their consent in AI-generated media. In California, it helped pass bills that require disclosure and permission when digital replicas are used—including posthumously.
The argument, across all these fronts, comes down to authorship. Who owns a face? A voice? A performance? And who gets paid when that performance is cloned?
SAG-AFTRA’s president Fran Drescher, who became a high-profile voice during last year’s strike, has previously called the use of AI without consent “a form of cultural theft.” That sentiment has gained traction among other unions too. The Writers Guild of America recently finalized a deal that bars studios from using AI-generated scripts to replace human writers, and the Directors Guild has flagged concerns over AI-generated storyboarding and previs work.
In a recent case involving Fortnite, SAG-AFTRA filed an unfair labor practice charge after the game deployed an AI-generated voice to mimic Darth Vader, alleging that performers’ work was handed to AI without notice to the union or good-faith bargaining. That case is still unfolding, but the precedent it sets could be critical.
