Artificial Intelligence

How an AI Actor Is Reframing Hollywood’s Debate Over Artificial Intelligence

AI actor Tilly Norwood releases a musical video arguing that artificial intelligence can expand creativity in film

Updated March 13, 2026 2:18 PM

AI Actor Tilly Norwood. PHOTO: INSTAGRAM@TILLYNORWOOD

As Hollywood prepares for this weekend’s Oscars, a different kind of performer is stepping into the spotlight — one that doesn’t physically exist.

Tilly Norwood, described as the world’s first AI actor, has released her debut musical comedy video, Take the Lead. The project arrives at a moment when artificial intelligence has become one of the most contentious topics in the film industry.

The song’s message is simple: AI should not be seen as a threat to actors but as another creative tool. The release also offers a first look at what Norwood’s creators call the “Tillyverse,” envisioned as a cloud-based entertainment world where AI characters can live, interact and perform.

Behind the character is actor and producer Eline van der Velden, CEO of production company Particle6 and AI talent studio Xicoia. Van der Velden created Tilly as a way to experiment with how artificial intelligence could be used in storytelling.

The timing is not accidental. The entertainment industry has spent the past few years debating the role AI should play in filmmaking and acting. Questions about digital replicas, automated performances and creative ownership continue to divide artists and studios.

Norwood’s musical video enters that debate with a different tone. Instead of warning about AI replacing actors, the project suggests that the technology could expand what performers are able to do.

The video itself also serves as a technical experiment. The song Take the Lead was generated using the AI music platform Suno. The video was then produced using a combination of widely available AI tools and Particle6’s own creative process.

One of the newer techniques used in the project is performance capture. Van der Velden physically acted out Tilly’s movements and expressions so the digital character could mirror a human performance. But the production was far from automated. According to Particle6, a team of 18 people worked on the video. The group included a director, editor, production designer, costume designer, comedy writer and creative technologist. In other words, the project still relied heavily on human creativity.

“Tilly has always been a vehicle to test the creative capabilities and boundaries of AI,” van der Velden said. “It’s not about taking anyone’s job.” She added that even with powerful tools, good AI content still takes time, taste and creative direction.

The project also reflects how quickly production technology is evolving. Tools that once required large studios are now accessible to smaller creative teams experimenting with AI-driven storytelling.

For Particle6, the character of Tilly Norwood acts as a testing ground. Each project explores how AI performers might be developed, directed and integrated into entertainment. Whether audiences embrace digital actors remains an open question. Many in the industry are still wary of how AI could reshape creative work.

But projects like Take the Lead show another possibility. Instead of replacing performers, artificial intelligence could become part of the creative process itself. In that sense, Tilly Norwood may represent something more than a virtual performer. She is also an experiment in how humans and machines might collaborate in the future of entertainment.


Cognizant Expands Google Cloud Partnership to Scale Enterprise AI Deployment

The IT services firm strengthens its collaboration with Google Cloud to help enterprises move AI from pilot projects to production systems

Updated February 18, 2026 8:11 PM

Google Cloud building. PHOTO: ADOBE STOCK

Enterprise interest in AI has moved quickly from experimentation to execution. Many organizations have tested generative tools, but turning those tools into systems that can run inside daily operations remains a separate challenge. Cognizant, an IT services firm, is expanding its partnership with Google Cloud to help enterprises move from AI pilots to fully deployed, production-ready systems.

Cognizant and Google Cloud are deepening their collaboration around Google’s Gemini Enterprise and Google Workspace. Cognizant is deploying these tools across its own workforce first, using them to support internal productivity and collaboration. The idea is simple: test and refine the systems internally, then package similar capabilities for clients.

The focus of the partnership is what Cognizant calls “agentic AI.” In practical terms, this refers to AI systems that can plan, act and complete tasks with limited human input. Instead of generating isolated outputs, these systems are designed to fit into business workflows and carry out structured tasks.

To make that workable at scale, Cognizant is building delivery infrastructure around the technology. The company is setting up a dedicated Gemini Enterprise Center of Excellence and formalizing an Agent Development Lifecycle, a framework covering the full process from early design and blueprinting through validation to production rollout. The aim is to give enterprises a clearer path from AI concept to deployed system.

Cognizant also plans to introduce a bundled productivity offering that combines Gemini Enterprise with Google Workspace. The targeted use cases are operational rather than experimental. These include collaborative content creation, supplier communications and other workflow-heavy processes that can be standardized and automated.

Beyond productivity tools, Cognizant is integrating Gemini into its broader service platforms. Through Cognizant Ignition, enabled by Gemini, the company supports early-stage discovery and prototyping while helping clients strengthen their data foundations. Its Agent Foundry platform provides pre-configured and no-code capabilities for specific use cases such as AI-powered contact centers and intelligent order management. These tools are designed to reduce the amount of custom development required for each deployment.

Scaling is another element of the strategy. Cognizant, a multi-year Google Cloud Data Partner of the Year award winner, says it will rely on a global network of Gemini-trained specialists to deliver these systems. The company is also expanding work tied to Google Distributed Cloud and showcasing capabilities through its Google Experience Zones and Gen AI Studios.

For Google Cloud, the partnership reinforces its enterprise AI ecosystem. Cloud providers can offer models and infrastructure, but enterprise adoption often depends on service partners that can integrate tools into existing systems and manage ongoing operations. By aligning closely with Cognizant, Google strengthens its ability to move Gemini from platform capability to production deployment.

The announcement does not introduce a new AI model. Instead, it reflects a shift in emphasis. The core question is no longer whether AI tools exist, but how they are implemented, governed and scaled across large organizations. Cognizant’s expanded role suggests that execution frameworks, internal deployment and structured delivery models are becoming central to how enterprises approach AI.

In that sense, the partnership is less about new technology and more about operational maturity. It highlights how AI is moving from isolated pilots to managed systems embedded in business processes — a transition that will likely define the next phase of enterprise adoption.