OpenAI announced Sora will shift from an opt-out to an opt-in copyright model, giving rights holders granular permissions and new permission APIs. Brands gain more control and should adopt clearance workflows, provenance tracking, and content governance.

OpenAI announced on 2025-10-04 that Sora will switch from an opt-out to an opt-in copyright model, giving rights holders granular control over how characters and other intellectual property are used in AI-generated video. The change follows swift backlash after unauthorized uses of recognizable IP circulated, and it aims to reduce legal and reputational risk for creators and businesses while improving overall content governance.
Sora launched amid strong interest in generative video and immediate concern about unauthorized use of famous characters, logos, and other copyrighted assets. Under the original opt-out approach, rights holders had to find and remove unauthorized content after it appeared. That reactive model placed the burden on creators and brands and increased operational cost.
According to reporting, OpenAI will implement an opt-in system that gives creators and brands granular controls over whether and how their characters and other assets can appear in generated video.
The update highlights a focus on brand-safe AI video and on giving enterprises tools they can integrate, for example permission APIs and provenance tracking to verify where content and rights originated.
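To make the opt-in idea concrete, here is a minimal sketch of how an enterprise-side permission gate might work. Everything below is hypothetical: `RightsRegistry`, `opt_in`, and `is_permitted` are illustrative names, not part of any published OpenAI API. The key property is the opt-in default: any use not explicitly granted is denied.

```python
from dataclasses import dataclass, field

@dataclass
class RightsRegistry:
    """Hypothetical registry mapping an IP identifier to the uses
    its rights holder has opted in to. Illustrative only."""
    grants: dict = field(default_factory=dict)  # ip_id -> set of allowed uses

    def opt_in(self, ip_id: str, uses: set) -> None:
        # Rights holder explicitly grants specific uses for an asset.
        self.grants.setdefault(ip_id, set()).update(uses)

    def is_permitted(self, ip_id: str, use: str) -> bool:
        # Opt-in default: anything not explicitly granted is denied.
        return use in self.grants.get(ip_id, set())

registry = RightsRegistry()
registry.opt_in("acme_mascot", {"marketing_video"})
```

Under this model, a generation request referencing `acme_mascot` for a marketing video would be allowed, while the same character in any ungranted context would be blocked before generation rather than removed after the fact.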
This move is a material shift in how platforms manage IP risk, but it does not eliminate the need for governance. Teams using AI video in marketing, training, or customer engagement should still clear rights before generation, track provenance, and document permissions.
Opt-in reduces the risk of straightforward infringement on covered assets, but it does not resolve all legal questions. Edge cases remain, including transformative uses and fair-use assessments that depend on jurisdiction and context, deepfake and reputation concerns that go beyond copyright, and third-party content that is not covered by opt-in controls. Human authorship and demonstrable creative input remain important factors in copyrightability under recent 2025 guidance.
OpenAI's shift to granular opt-in copyright controls in Sora is a meaningful step toward safer generative video. It advances AI copyright controls, permission APIs, and provenance verification as practical tools for enterprises, but organizational processes must keep pace. For businesses the takeaway is clear: adopt content governance practices, implement clearance workflows, and require documented permissions to fully benefit from these platform-level improvements as generative video governance evolves in 2025.
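As one way to operationalize "documented permissions," a clearance workflow can emit a tamper-evident record for each grant. The sketch below is an assumption about what such a record might contain (the field names are illustrative, not a standard schema); the SHA-256 digest over the record makes later tampering detectable during an audit.

```python
import hashlib
import json
from datetime import datetime, timezone

def provenance_record(asset_id: str, rights_holder: str, permitted_uses: list) -> dict:
    """Build an auditable clearance record for a rights grant.
    Hypothetical schema for illustration; not a published standard."""
    record = {
        "asset_id": asset_id,
        "rights_holder": rights_holder,
        "permitted_uses": sorted(permitted_uses),
        "granted_at": datetime.now(timezone.utc).isoformat(),
    }
    # Digest covers the canonical JSON form so any later edit is detectable.
    payload = json.dumps(record, sort_keys=True).encode()
    record["digest"] = hashlib.sha256(payload).hexdigest()
    return record

rec = provenance_record("acme_mascot", "Acme Corp", ["marketing_video"])
```

Storing such records alongside generated assets gives legal and compliance teams a concrete artifact to review, which is the documented-permissions practice the platform-level controls are meant to support.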