Protect Yourself from AI and the Perpetuity Trap
By Timothy Beck Werth on March 17, 2026
An emerging creator lands their first brand deal. The payment covers rent, but not a lawyer to review the contract, so they sign it as-is. Later, they find their face appearing in advertisements they never consented to.
Attorney Michelle May O’Neil, an expert in NIL (name, image, likeness) matters, warns creators to scrutinize contracts closely. At SXSW, she shared insights on how creators might inadvertently relinquish their online identity.
Beyond contractual issues, AI poses another threat, creating clones and deepfakes that misuse creators’ likenesses and voices.
O’Neil cited the case of TikTok influencers Kat and Mike Stickler, who battled over their social media accounts amid a divorce, to emphasize that fairness isn’t guaranteed in legal disputes. Creators must understand and protect their brand identity, particularly as technology outpaces the law.
Please note, this article does not constitute legal advice. Consult your lawyer before making significant decisions.
It’s Not Identity Theft if You Give It Away: Avoid the Perpetuity Trap
When reviewing contracts with brands, creators should watch for specific language:
– Perpetual / in perpetuity
– All media now known or later developed
– Derivative works
– Sublicensable
– Irrevocable
A concerning clause might state: “Creator grants Brand a perpetual, irrevocable, sublicensable right to use Creator’s name, image, likeness, and voice in all media now known or later developed.”
Many young creators feel pressured to accept deals, believing refusal might limit opportunities. However, signing away perpetual rights can have lasting consequences.
When asked what recourse a creator has after signing such a contract, O’Neil stated bluntly, “If they sign it, they sign it.” The lawyers drafting these agreements understand exactly what the language means, which is why creators must be vigilant before they sign.
Add Language to Contracts to Protect Yourself
Creators can protect themselves during negotiation, for example by requesting a sunset clause that limits how long a brand may use their likeness. Rather than trying to delete a troublesome clause outright, add explicit exclusions, such as language barring AI-generated representations.
Technology Evolves Faster Than the Law
Deepfakes have become common, with bad actors generating AI clones from minimal inputs. And unlike unionized actors and writers, creators have no collective-bargaining power to win AI protections.
If a company like Simon & Schuster insists on rights to an AI clone of your voice, declining the offer may be the wiser choice. At minimum, push for contract clauses that explicitly carve out AI clones.
LLCs and Trademark Law
O’Neil highlights two additional risks: creators who neglect to establish a legal entity such as an LLC, and creators who fail to trademark and defend their intellectual property. Weak international protections make both problems harder to fix.
For instance, an AI deepfake featuring Tom Cruise persisted online despite his legal efforts, illustrating the struggle creators face in defending their likenesses globally.
Put the Family Instagram Account in the Pre-nup
In partnerships, discuss who owns the content and plan for a breakup or divorce before it happens. For married couples, that can mean addressing shared social media accounts in a prenup.
AI is the Problem, Not the Solution
Despite AI’s potential, it is not a substitute for legal counsel. Relying on tools like ChatGPT for contract advice is risky, as demonstrated by legal actions against OpenAI over the unlicensed practice of law.
We’re in an AI-driven era with insufficient regulations protecting the creator economy. Creators must proactively secure their rights, as external safeguards are limited.
Some quotes in this article have been edited for clarity and grammar.
Disclosure: Ziff Davis, Mashable’s parent company, filed a lawsuit against OpenAI in April 2025, claiming copyright infringements.
