The Sora Shutdown Exposes a Deeper Problem: Why AI Video Tools Can't Ignore Copyright

OpenAI's decision to shut down Sora, its consumer-facing video generation app, has unraveled a major licensing deal with Disney and revealed a critical gap in how the AI industry handles creator rights. The £1 billion agreement between Disney and OpenAI, which would have allowed Sora users to generate videos featuring over 200 Disney characters, collapsed when OpenAI shifted its focus away from consumer applications and toward practical AI tools that solve real-world problems. The move signals deeper challenges facing the video generation industry: without clear copyright frameworks and fair compensation models, even the biggest tech companies struggle to build sustainable products.

Why Did OpenAI Really Abandon Sora?

The official reason for Sora's shutdown was strategic. OpenAI stated it wanted to focus on practical AI tools that would help solve real-world problems, moving away from consumer-facing applications. However, industry observers point to another factor: Sora had begun to lag behind competitors. Guy Gadney, CEO of Charismatic.AI, noted that Sora had "started to lag behind" its rivals, particularly Google's Veo 3 model and the Seedance model developed by ByteDance, the Chinese owner of TikTok. The timing raises questions about whether technical limitations, market competition, or copyright concerns played a role in the decision.
What's particularly telling is that Sora was never publicly available in the United Kingdom. Benjamin Field, CEO of Deep Fusion Films, attributed this to potential copyright issues, citing OpenAI's "guarded" approach to sharing details about its training data. This suggests that copyright uncertainty may have constrained Sora's expansion from the start, limiting its market reach and commercial viability.

What's Happening With Copyright and AI Video Right Now?

The Sora collapse arrives at a critical moment for copyright policy in the AI era. In March 2026, the UK government announced a significant shift in its approach to AI training and copyrighted materials. The government no longer favors allowing AI developers to train on copyrighted content unless rights holders explicitly opt out. Instead, it now plans to require content licensing for AI training, reversing its earlier position after widespread opposition from creators and industry stakeholders.

The government acknowledged that licensing markets for AI training are "new and evolving" and said it does not want to interfere in those markets at this time. However, a report commissioned by the British Film Institute's Rapid Evidence Assessment and Data Review program found that there is currently "little licensing of third-party-owned screen content for AI training in the UK". This gap between policy intent and market reality creates uncertainty for companies like OpenAI trying to build legitimate, licensed products.

How Are Creators Responding to AI Video Tools?

Creators across the screen sector are demanding a formal framework for how AI tools should operate. In February 2026, associations representing creative professionals gathered evidence from over 10,000 creators and delivered a cross-sector report titled "Brave New World? Justice For Creators in the Age of Gen AI." The report called for adoption of a CLEAR Framework focused on five key principles:

  • Consent: Creators must explicitly agree before their work is used to train AI models
  • Licensing: Content owners should be able to negotiate terms for AI training access
  • Ethical Use: AI developers must follow responsible practices in model development and deployment
  • Accountability: Companies must be held responsible for misuse of creator content
  • Remuneration: Creators must receive fair compensation when their work is used

The report, commissioned by the Society of Authors, the Association of Illustrators, the Independent Society of Musicians, and the Association of Photographers, argues that creators are not anti-AI but need to be respected and compensated for providing their content. The framework is intended to serve as "the minimum standard for a functioning, fair and ethical creative economy."

What Happened When ByteDance's Seedance Generated Copyrighted Content?

The urgency of these copyright concerns became clear in February 2026 when the second version of ByteDance's Seedance model was used to generate videos featuring copyrighted characters and real people without permission. One viral example showed a scene of actors Brad Pitt and Tom Cruise fighting, neither of whom had authorized the use of their likenesses. Both Disney and Paramount sent cease-and-desist letters to ByteDance, and SAG-AFTRA, the actors' union, accused the company of a "blatant" disregard for law and ethics, stating that "responsible AI development demands responsibility".

This incident illustrates why licensing deals like the Disney-OpenAI agreement matter. Without clear legal frameworks and compensation models, AI video tools can easily be misused to generate content that violates intellectual property rights and performer rights. The Seedance incident showed that even as some companies attempt to build licensed products, others may operate in legal gray areas, creating reputational and regulatory risks for the entire industry.

How Are Performers and Voice Actors Being Affected?

The copyright and consent issues extend beyond visual content to voice and likeness rights. In March 2026, a University of Reading report on AI Human Avatars highlighted a troubling case involving voice actor Gayanne Potter. A clone of her voice was used by the rail company ScotRail without her knowledge or permission. The clone was sold by the Swedish company ReadSpeaker, which Potter had previously worked with to lend her voice for accessibility and e-learning software.

"I'm not anti-AI, I'm about consent," Potter stated, adding that the incident and surrounding publicity had impacted her income and livelihood.
Potter's case underscores a critical gap in UK law. The AI Human Avatars report found that UK law surrounding ownership of AI avatars is "fragmented" and the current framework is "unfit for purpose," leaving "UK businesses and services unable to access the growth opportunities of the technology without policy intervention". Performers need legal recourse when previously signed contracts are affected by technological change, but current protections are inadequate.

What Steps Should AI Video Companies Take to Build Trust?

As the industry grapples with these challenges, several practical approaches are emerging for how AI video developers can operate responsibly and sustainably:

  • Transparent Training Data Disclosure: Companies should clearly communicate which content was used to train their models and obtain explicit consent from rights holders, rather than operating with "guarded" approaches that fuel distrust
  • Licensing Agreements: Develop formal licensing deals with content creators and rights holders, similar to the Disney-OpenAI model, to ensure creators are compensated and have control over how their work is used
  • Performer Protection Contracts: Negotiate fair contracts that protect human likenesses and voice rights, with clear terms about how AI avatars can be used and what compensation performers receive
  • Compliance with Emerging Standards: Adopt frameworks like the CLEAR Framework proposed by creator associations to demonstrate commitment to consent, licensing, ethical use, accountability, and remuneration
  • Proactive Moderation: Implement safeguards to prevent misuse of copyrighted characters and real people's likenesses, rather than waiting for cease-and-desist letters from studios and unions
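The last point, proactive moderation, is the most directly implementable. A minimal form is a pre-generation check that rejects prompts referencing protected characters or real people before any video is rendered. The sketch below is a hypothetical illustration under stated assumptions: the denylist entries, function name, and matching rule are invented for this example and do not describe any vendor's actual safeguards (which would realistically involve far richer classifiers than a name list).

```python
# Hypothetical sketch of a pre-generation safeguard: reject prompts that
# reference protected likenesses before any video is rendered.
# Denylist contents and the API shape are illustrative assumptions.
import re

PROTECTED_NAMES = {"brad pitt", "tom cruise"}  # real-person likenesses
PROTECTED_CHARACTERS = {"mickey mouse"}        # licensed characters


def check_prompt(prompt: str) -> tuple[bool, list[str]]:
    """Return (allowed, matched_terms) for a generation prompt.

    A prompt is allowed only if it matches no protected name, using
    word-boundary matching so partial words do not trigger false hits.
    """
    text = prompt.lower()
    hits = [name for name in PROTECTED_NAMES | PROTECTED_CHARACTERS
            if re.search(r"\b" + re.escape(name) + r"\b", text)]
    return (len(hits) == 0, hits)


allowed, hits = check_prompt("Brad Pitt and Tom Cruise fighting")
# allowed is False: both names match, so generation would be refused
```

The design point is that the check runs before generation, turning rights enforcement into a refusal at submission time rather than a cease-and-desist letter after the fact.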

The collapse of the Sora deal and the Seedance incident suggest that companies attempting to build legitimate, licensed products may struggle to compete with those operating in legal gray areas. However, as regulatory frameworks tighten and creator advocacy grows stronger, the long-term advantage will likely belong to companies that prioritize transparency, consent, and fair compensation from the start.

The video generation industry is at an inflection point. OpenAI's decision to shut down Sora and refocus on practical tools suggests that consumer-facing AI video applications may not be viable without resolving copyright and licensing challenges first. For the industry to mature sustainably, companies will need to move beyond the assumption that they can train on copyrighted content without permission and instead build business models that respect creator rights and compensate them fairly.