Luma's New AI Agents Can Handle Entire Creative Campaigns in Days, Not Months

Luma AI has released Luma Agents, a new class of AI systems designed to handle complete creative projects from start to finish, working across text, images, video, and audio simultaneously. The agents are powered by Luma's Unified Intelligence family of models, which use a single multimodal reasoning system trained on audio, video, image, language, and spatial reasoning. Unlike traditional AI tools that require constant prompting and iteration, these agents can plan, generate, and refine creative work while coordinating with other AI models from companies like Google, ByteDance, and ElevenLabs.

What Makes Luma Agents Different From Other Creative AI Tools?

The key innovation behind Luma Agents is their ability to maintain persistent context across multiple assets, collaborators, and creative iterations. Rather than forcing users to learn how to prompt dozens of different models, the system generates large sets of variations and lets users steer the direction through natural conversation. This approach mirrors how human architects work: as they sketch, they're simultaneously visualizing light, spatial dynamics, and the lived experience of a space. Luma's Unified Intelligence operates on the same principle, understanding context while generating outputs.

The startup has already begun rolling out Luma Agents to major customers, including global advertising agencies Publicis Groupe and Serviceplan, as well as brands like Adidas, Mazda, and Saudi AI company Humain. The platform is now publicly available via API, though Luma plans to roll out access gradually to ensure reliable service and avoid workflow disruptions.
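Luma has not published the Agents API surface in this article, so as an illustration only, here is what assembling a campaign-brief request for such an API might look like. Every field name, task identifier, and structure below is a hypothetical assumption, not Luma's documented schema:

```python
import json

def build_campaign_request(brief: str, product_image_url: str, markets: list[str]) -> dict:
    """Assemble a hypothetical request body for a campaign-generation agent.

    All keys here are illustrative assumptions; consult Luma's actual API
    documentation for the real schema.
    """
    return {
        "task": "campaign_ideation",  # assumed task identifier
        "brief": brief,  # the short creative brief (e.g. ~200 words)
        "assets": [{"type": "image", "url": product_image_url}],
        "localization": {"markets": markets},  # per-market variants
        "outputs": ["text", "image", "video", "audio"],
    }

payload = build_campaign_request(
    brief="Launch a summer running-shoe campaign aimed at urban commuters.",
    product_image_url="https://example.com/shoe.png",
    markets=["US", "DE", "JP"],
)
print(json.dumps(payload, indent=2))
```

The point of the sketch is the shape of the input, matching the article's description: a short brief plus a single product image, with localization and multi-format outputs handled by the agent rather than by separate per-model prompts.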

How Can Creative Teams Use Luma Agents in Their Workflows?

  • Campaign Ideation: Input a brief description and product image, and the system generates multiple location ideas, model suggestions, and color schemes for ad campaigns without requiring manual iteration on each element.
  • Multi-Market Localization: Convert a single campaign concept into localized versions for different countries, automatically adjusting cultural context and messaging while maintaining brand consistency.
  • Iterative Refinement: The agents evaluate their own outputs and refine them through self-critique, similar to how coding agents check their work until the solution is accurate and complete.
  • Cross-Asset Coordination: Manage text, images, video, and audio as a unified project rather than separate deliverables, maintaining visual and narrative consistency across all materials.
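The "Iterative Refinement" behavior above can be sketched as a generic generate-critique-revise loop. The `generate`, `critique`, and `revise` callables are placeholders for whatever models an agent coordinates; this is a minimal sketch of the pattern, not Luma's implementation:

```python
from typing import Callable

def refine_until_accepted(
    generate: Callable[[str], str],
    critique: Callable[[str], tuple[bool, str]],  # returns (accepted, feedback)
    revise: Callable[[str, str], str],  # takes (draft, feedback)
    brief: str,
    max_rounds: int = 3,
) -> str:
    """Produce a draft, then self-critique and revise until it passes or rounds run out."""
    draft = generate(brief)
    for _ in range(max_rounds):
        accepted, feedback = critique(draft)
        if accepted:
            break
        draft = revise(draft, feedback)
    return draft
```

The loop mirrors how coding agents check their own work: each revision is conditioned on the critique of the previous draft, and the `max_rounds` cap keeps an agent that never satisfies its own critic from looping forever.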

What Real-World Results Has Luma Achieved With These Agents?

In a demonstration, Luma's CEO showed how a 200-word brief and a single product image led the system to generate comprehensive campaign ideas. More impressively, the company completed a $15 million annual advertising campaign in just 40 hours for under $20,000, with the output passing the brand's internal quality controls and accuracy checks. This represents a dramatic compression of timeline and cost compared to traditional creative workflows.

"Our customers aren't buying the tool; they're redoing how business is done," said Amit Jain, chief executive officer and co-founder of Luma.


Jain explained that the current workflow for using AI in creative environments falls short of what the industry expects. Instead of a seamless experience, it often feels like "here are 100 models, learn how to prompt them." Luma Agents eliminate this friction by handling the complexity behind the scenes.


How Does Unified Intelligence Enable This Capability?

The foundation of Luma Agents is the Uni-1 model, the first in Luma's Unified Intelligence family. Unlike traditional AI systems trained separately for different tasks, Uni-1 is a single multimodal reasoning system trained jointly on audio, video, image, language, and spatial reasoning. This allows the model to "think in language and imagine and render in pixels or images," which Jain describes as "intelligence in pixels."

The architecture enables agents to maintain context across creative iterations in ways that single-purpose models cannot. When a user adjusts a campaign direction, the system understands not just the new instruction but how it relates to previous decisions, brand guidelines, and the overall creative vision. Future releases will expand output capabilities to include additional audio and video features.

For ad agencies, design studios, and marketing teams, Luma Agents represent a fundamental shift in how creative work gets done. Rather than accelerating individual tasks, the platform compresses entire project timelines, reduces the need for specialized AI expertise, and maintains quality standards that pass internal review. As the creative industry continues to adopt AI, systems that can handle end-to-end workflows while preserving human creative direction may become essential infrastructure.