When AI Must Ask Permission: The MLK Ban, the Next Victims, and the Coming Compliance Storm
Oct 18, 2025
OpenAI just hit pause.
It suspended Sora 2 depictions of Dr. Martin Luther King Jr. after AI-generated videos surfaced that used his likeness in disrespectful ways. Families, estates, public opinion - all pushed it to act.
But the real question is deeper: who is next?
Every iconic face, every legacy, every public figure is now fair game unless you have explicit control. And that undermines Sora's core appeal - the ability to "make reality" from your prompts. If a platform can't guarantee what its AI won't use, the appeal cracks.
We may be standing at the start of a major compliance and regulation era in generative video. And that will change the game for all players — OpenAI, its rivals, and the businesses that use these tools.
In this post I’ll unpack:
What the MLK ban reveals about platform limits
The chilling question of which faces will be next
The hole in Sora’s value proposition if “permission” becomes mandatory
Why we’re accelerating toward regulation, not ignoring it
How rivals like Seedream and Nano Banana may reposition
What you, as a small business or creator, need to think through now
MLK Ban: A Turning Point, Not a Fix
When OpenAI paused MLK deepfakes, it wasn’t just a PR move. It was the first real crack in Sora’s “anything goes” façade.
Sora 2's promise hinged on unlimited creative freedom: input a prompt, get a video. But freedom without guardrails invites abuse. Videos of MLK doing offensive things ignited the backlash, and OpenAI rushed in with a new process: estates can now request opt-outs.
But that brings problems:
Reactive posture: Sora's policy is reactive - it waits for outrage, then patches. That's not governance, that's Whac-A-Mole.
Ambiguity in “recently deceased” and “public figure”: Who qualifies? The policy isn’t precise.
Control without enforcement: Even if you opt out, how do you stop leaked models or third-party tools from recreating a likeness?
If Sora's open system expects everyone to trust it with identity rights, the MLK pause is an admission that the trust wasn't earned.
Who’s Next? What Happens When Everyone Is At Risk
After MLK, who decides which names come next? And how many will cry, "that's crossing the line"?
Imagine:
Elvis Presley suddenly revived to promote burgers
Audrey Hepburn in cosmetic ads
Nelson Mandela narrating meme videos
Every immortal image can be a commercial puppet. If Sora allows it, and no law stops it, those legacy owners will push back. Estates, families, fans - they will demand control.
Once those demands pile up, Sora faces structural liability: if users misuse likenesses, lawsuits will target the platform, not just the creator. That raises the cost of doing “free creativity.”
If the next domino is personal likeness law, then Sora’s foundational argument - “create anything” - becomes hollow.
The Appeal of Sora Will Be Reversed
Today, Sora's appeal is: "You can make anyone do almost anything." With that appeal also comes a latent risk: if you can do anything, you may violate rights without knowing it.
Once permission and liability become first-class, the appeal flips: You must prove the right to create, not just create and defend.
For creative businesses, that changes workflow:
You’ll need pre-release vetting
You'll need asset rights checks and agreements with estates
You’ll need insurance and legal layers
What was once “fast video from text” becomes “slow, verified video with compliance.”
Sora will either have to build those tools in or watch creators move to safer platforms.
Regulation Is Not Coming - It’s Arrived
This MLK moment is a signal.
Governments, IP courts, and public opinion are now testing boundaries. The Japanese government is already asking OpenAI to stop infringing on manga and anime heritage. We will see waves of legislative responses:
Postmortem publicity rights (extending rights after death)
Mandatory watermarking and provenance metadata (a sketch of this follows the list)
Fines or licenses for using public figures’ likenesses
Compulsory “AI content labels” or warnings
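To make the watermarking and provenance requirement concrete, here is a minimal sketch, assuming a homemade JSON sidecar format: every field name (ai_generated, generator, prompt_sha256, created_at) is an illustrative assumption, not a standard. A real deployment would embed signed C2PA-style content credentials in the media file itself rather than write a loose sidecar.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def write_provenance_sidecar(video_path: str, generator: str, prompt: str) -> Path:
    """Attach a provenance record to a generated video as a JSON sidecar.

    Illustrative only: production systems would embed signed C2PA-style
    content credentials in the file itself, not a homemade JSON format.
    """
    record = {
        "ai_generated": True,  # explicit AI content label
        "generator": generator,  # which model produced the clip
        # hash lets you trace the request without storing the raw prompt
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "created_at": datetime.now(timezone.utc).isoformat(),
        "source_file": Path(video_path).name,
    }
    sidecar = Path(video_path).with_suffix(".provenance.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar

# Example: label a freshly rendered clip (names are hypothetical)
write_provenance_sidecar(
    "campaign_clip.mp4",
    "example-video-model-v2",
    "a product demo narrated by a licensed spokesperson",
)
```

The point is the shape of the record: an explicit AI label, a traceable generator, and a timestamp that travels with the file.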
This is not a future fight. It’s beginning now, quietly, in estates, in lawsuits, behind the scenes.
If platforms don’t build compliance modes, they’ll be forced into them.
Competitors: Seedream, Nano Banana & the Alternative Paths
Sora doesn't exist in isolation. Other image/video engines are attacking the same problem from different angles.
Take Seedream 4.0, the ByteDance tool competing with Google's Nano Banana. It's gaining a reputation for high fidelity and batch control.
What happens if Seedream builds in permission architecture early, or ties likeness control to identity verification? That might give it a safer path for adoption in regulated spaces.
Nano Banana, integrated deeply with Google’s ecosystem, may offer compliance hooks via Google’s existing identity infrastructure.
In short, the winner won’t be the tool that’s the fastest or prettiest. It’ll be the one that makes permission safe and frictionless.
What You Must Build Into Your Strategy Now
If you’re using or exploring AI video tools, here are guardrails you need now — not later:
Likeness Permission Tracking: Always get written permission for any real person. For deceased persons, check estate agreements or stay away.
Audit & Logging: Your AI system should log who requested what, from which prompt, at what time, and allow reversal (a minimal sketch follows this list).
Provenance Labels & Watermarks: Embed metadata or visible watermarks. Let viewers know this was AI from the start.
Compliance Workflow: Build human review in for any use of public figures. Don't let models run unchecked.
Insurance and Legal Buffer: As this field becomes riskier, carry liability coverage or contracts that shift risk to clients.
Stay Platform-Aware: Monitor how the AI platforms you use respond to policy and court decisions. Be ready to pivot.
Communicate Transparency to Your Audience: In your disclaimers, show your audience you respect identity, legacy, and truth.
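To ground the first two guardrails, here is a minimal Python sketch of a permission registry paired with an append-only audit log. Everything in it is hypothetical (LikenessRegistry, request_generation, and reverse are invented names, and the in-memory dict stands in for a real database and signed releases); it shows the shape of the control, not a production implementation.

```python
import uuid
from datetime import datetime, timezone

class LikenessRegistry:
    """Hypothetical permission tracker plus audit log (in-memory sketch).

    A real system would persist to a database and verify a signed release
    from the person or estate, not flip a bare boolean.
    """

    def __init__(self):
        self._permissions = {}  # person name -> written permission on file?
        self._audit_log = []    # append-only record of generation requests

    def grant(self, person: str) -> None:
        """Record that written permission (or an estate agreement) is on file."""
        self._permissions[person] = True

    def request_generation(self, user: str, person: str, prompt: str) -> str:
        """Gate a generation request: refuse unless permission is on file."""
        if not self._permissions.get(person, False):
            raise PermissionError(f"No written permission on file for {person!r}")
        entry_id = str(uuid.uuid4())
        self._audit_log.append({
            "id": entry_id,
            "user": user,            # who requested it
            "person": person,        # whose likeness
            "prompt": prompt,        # from which prompt
            "timestamp": datetime.now(timezone.utc).isoformat(),  # at what time
            "reversed": False,
        })
        return entry_id  # keep this id so the request can be reversed later

    def reverse(self, entry_id: str) -> None:
        """Mark a logged generation as reversed (e.g., after a takedown)."""
        for entry in self._audit_log:
            if entry["id"] == entry_id:
                entry["reversed"] = True
                return
        raise KeyError(f"No audit entry {entry_id!r}")

# Usage: permission first, then a logged, reversible generation
registry = LikenessRegistry()
registry.grant("Jane Example")  # signed release on file
job = registry.request_generation("editor@studio", "Jane Example", "product demo video")
registry.reverse(job)           # takedown recorded in the log
```

The key design choice is that the gate and the log live in the same place: a generation request either fails the permission check or leaves a reversible audit entry.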
Conclusion
The MLK pause is not just a moment. It’s a signal flare.
Sora’s wild beauty is beginning to clash with the structural reality that identity matters.
When permission becomes non-negotiable, the appeal flips.
Regulation will arrive; tools without guardrails will be stranded relics.
Competitors who bake in consent will steal adoption.
And creators and businesses who treat identity like property will survive - while those who treat it like data will burn.
At Intellisite.co, we don’t just adopt AI. We design it with respect, rights, and reputation baked in.