Adobe Adds Chinese AI Video Models to Creative Suite, Raising Questions About Commercial Safety in the Agentic Era

Adobe is betting that creative professionals will accept a mix of commercially safe first-party AI models and riskier third-party alternatives, as long as transparency tools make the distinction clear. The company announced the addition of Kling 3.0 and Kling 3.0 Omni, two video generation models developed by Kuaishou, a Chinese technology company, to its Firefly creative platform. This move comes alongside the launch of Adobe's new Firefly AI Assistant, an agentic tool that can autonomously select which AI model to use for a given creative task, raising important questions about how commercial safety guarantees work when an AI system makes those choices on behalf of users.

What Are Kling 3.0 Models and Why Did Adobe Choose Them?

Kling 3.0 is designed for fast, high-quality video production with smart storyboarding and audio-visual synchronization capabilities. The Omni variant adds professional controls for shot duration, camera angle, and character movement across multi-shot sequences, giving creators more granular control over video generation. These additions bring Firefly's roster of available AI models to more than 30, joining Google's Nano Banana 2 and Veo 3.1, Runway's Gen-4.5, Luma AI's Ray3.14, Black Forest Labs' FLUX.2[pro], and ElevenLabs' Multilingual v2.

The expansion reflects a strategic shift in how Adobe approaches AI integration. Rather than relying solely on its own proprietary Firefly models, trained on licensed Adobe Stock imagery and public domain content, the company is deliberately offering customers access to external models with different commercial safety profiles. This choice becomes more complex when an agentic system decides which model to invoke without explicit user direction.

How Does Commercial Safety Work When an AI Agent Chooses the Model?

When asked whether Adobe had concerns about integrating a model from a Chinese technology company given current geopolitical tensions, Alexandru Costin, Vice President of AI and Innovation at Adobe, offered a direct response:

"We think choice is what we want to offer our customers," explained Costin.

He distinguished between Adobe's own commercially safe, first-party Firefly models and third-party partner models, which carry different commercial safety profiles.

The nuance matters significantly in the agentic era. When the Firefly AI Assistant autonomously selects which model to use for a given task, the commercial safety guarantees may vary depending on which engine it invokes. Costin acknowledged this complexity:

"The agentic power, and the fact that the assistant has access to all of those models, means it could decide to use a model that carries different content credentials," he noted.

Adobe's response to this transparency challenge relies on its Content Credentials system, a metadata-and-fingerprinting framework developed through the Content Authenticity Initiative.

Adobe offers commercial indemnity for its first-party Firefly models but applies different indemnity levels for third-party models, a distinction that enterprise buyers will need to carefully evaluate. The company's strategy assumes that transparency about how content was created will allow users to decide whether a particular piece of content is commercially safe for their use case.

Steps to Navigate Commercial Safety When Using the Firefly AI Assistant

  • Understand Model Tiers: Adobe's first-party Firefly models offer full commercial indemnity, while third-party models like Kling 3.0 carry different indemnity levels depending on the vendor and use case.
  • Check Content Credentials: Before using generated content in production work, review the Content Credentials metadata to understand which model created the asset and what commercial safety guarantees apply.
  • Match Use Case to Model: For ideation and non-production work, third-party models may be acceptable; for final production assets, prioritize models with stronger commercial indemnity coverage.
  • Review Subscription Requirements: Using the Firefly AI Assistant requires an active Adobe subscription that includes the relevant apps, and generative actions consume existing generative credits.
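The triage logic in the steps above can be sketched in a few lines. This is a hypothetical illustration only: the manifest shape, the field name `claim_generator`, the model names, and the tier labels are all assumptions for the sake of the example, not Adobe's actual Content Credentials schema or indemnity terms.

```python
# Hypothetical sketch: sorting generated assets by the model recorded in
# their Content Credentials metadata. The manifest structure and tier
# mapping below are illustrative assumptions, not Adobe's real schema.

# Assumed set of first-party model names (illustrative).
FIRST_PARTY_MODELS = {"Firefly Image", "Firefly Video"}

def safety_tier(manifest: dict) -> str:
    """Return an assumed indemnity tier for the model named in a
    simplified, made-up Content Credentials manifest."""
    model = manifest.get("claim_generator", "")
    if model in FIRST_PARTY_MODELS:
        return "first-party: full commercial indemnity"
    return "third-party: review vendor indemnity before production use"

# Example: an asset the Firefly AI Assistant generated with a partner model.
manifest = {"claim_generator": "Kling 3.0"}
print(safety_tier(manifest))
```

In a real workflow, the manifest would come from reading the asset's embedded Content Credentials rather than a hand-built dictionary; the point is simply that provenance metadata can drive an automated "safe for production?" check before assets leave the ideation stage.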

Why Does This Matter for Creative Professionals?

The integration of Kling 3.0 signals that Adobe is willing to embrace external AI models even when they come from companies in geopolitically sensitive regions. Costin explained the reasoning:

"For some use cases, like ideation, non-production use cases, we got requests from customers to support some external models. If I'm in ideation, I might be more flexible with commercial safety. When I go into production, I'd want to have a model that gives you more confidence," he stated.


This tiered approach reflects a pragmatic reality: no single AI model excels at every creative task, and customers have expressed demand for diverse options. However, it also creates a new responsibility for users. When an agentic system makes autonomous decisions about which model to invoke, users must understand the commercial and legal implications of that choice. The burden of transparency falls on Adobe's Content Credentials system to make those implications visible.

The Firefly AI Assistant will enter public beta in the coming weeks, though Adobe did not specify an exact date. For enterprise customers and creative professionals working on high-stakes projects, the distinction between first-party and third-party model safety will likely become a key factor in adoption decisions. Adobe's bet is that transparency, combined with user choice, will be sufficient to manage the risks inherent in a multi-model, agentic creative environment.