AI actor Tilly Norwood releases a musical video arguing that artificial intelligence can expand creativity in film
Updated
March 13, 2026 2:18 PM

AI Actor Tilly Norwood. PHOTO: INSTAGRAM@TILLYNORWOOD
As Hollywood prepares for this weekend’s Oscars, a different kind of performer is stepping into the spotlight — one that doesn’t physically exist.
Tilly Norwood, described as the world’s first AI actor, has released her debut musical comedy video, Take the Lead. The project arrives at a moment when artificial intelligence has become one of the most contentious topics in the film industry.
The message of the song is simple: AI should not be seen as a threat to actors but as another creative tool. The release also offers a first look at what Norwood’s creators call the “Tillyverse,” envisioned as a cloud-based entertainment world where AI characters can live, interact and perform.
Behind the character is actor and producer Eline van der Velden. She is the CEO of production company Particle6 and AI talent studio Xicoia. Van der Velden created Tilly as a way to experiment with how artificial intelligence could be used in storytelling.
The timing is not accidental. The entertainment industry has spent the past few years debating the role AI should play in filmmaking and acting. Questions about digital replicas, automated performances and creative ownership continue to divide artists and studios.
Norwood’s musical video enters that debate with a different tone. Instead of warning about AI replacing actors, the project suggests that the technology could expand what performers are able to do.
The video itself also serves as a technical experiment. The song Take the Lead was generated using the AI music platform Suno. The video was then produced using a combination of widely available AI tools and Particle6’s own creative process.
Among the techniques used in the project was performance capture: van der Velden physically acted out Tilly’s movements and expressions so the digital character could mirror a human performance. But the production was far from automated. According to Particle6, a team of 18 people worked on the video, including a director, editor, production designer, costume designer, comedy writer and creative technologist. In other words, the project still relied heavily on human creativity.
“Tilly has always been a vehicle to test the creative capabilities and boundaries of AI,” van der Velden said. “It’s not about taking anyone’s job.” She added that even with powerful tools, good AI content still takes time, taste and creative direction.
The project also reflects how quickly production technology is evolving. Tools that once required large studios are now accessible to smaller creative teams experimenting with AI-driven storytelling.
For Particle6, the character of Tilly Norwood acts as a testing ground. Each project explores how AI performers might be developed, directed and integrated into entertainment. Whether audiences embrace digital actors remains an open question. Many in the industry are still wary of how AI could reshape creative work.
But projects like Take the Lead show another possibility. Instead of replacing performers, artificial intelligence could become part of the creative process itself. In that sense, Tilly Norwood may represent something more than a virtual performer. She is also an experiment in how humans and machines might collaborate in the future of entertainment.
A wearable ring, conversational AI and US$23M in funding. Sandbar wants to rethink how we interact with technology
Updated
March 12, 2026 5:59 PM

Sandbar's Stream ring. PHOTO: SANDBAR
Sandbar, a New York–based interface startup, has raised US$23 million in Series A funding to develop a wearable device that lets people interact with artificial intelligence via voice rather than screens.
Adjacent and Kindred Ventures, both venture firms focused on early-stage technology startups, led the round. The investment brings Sandbar’s total funding to US$36 million. Earlier backing included a US$10 million seed round led by True Ventures, a venture capital firm, as well as a US$3 million pre-seed round supported by Upfront Ventures, a venture firm, and Betaworks, a startup studio and investment firm.
Sandbar was founded by Mina Fahmi and Kirak Hong, who previously worked together at CTRL-labs, a neural interface startup acquired by Meta in 2019. Their earlier work explored how computers could respond more directly to human intent — an idea that continues to shape Sandbar’s approach to AI interfaces.
The new funding will help the company expand its team across machine learning, interaction design and software engineering as it prepares to launch its first product. That product, called Stream, combines a wearable ring with a conversational AI interface. The system allows users to speak to an AI assistant without unlocking a phone or opening an app.
The concept is simple. Instead of typing into a screen, users press a button on the ring and talk. The system can capture notes, organize ideas, retrieve information from the web or trigger actions through connected applications.
The ring includes a microphone, a touchpad and subtle haptic feedback. These elements allow the device to respond through gentle vibrations rather than visual alerts. According to the company, the ring only listens when the user presses the button — a design meant to address common concerns around always-on microphones.
That design reflects a larger shift Sandbar believes is underway. As AI assistants become more capable, many startups are experimenting with new ways to interact with them. The focus is moving away from screens and keyboards toward interfaces that feel more natural and immediate.
Stream uses multiple AI models working together to process requests, search the web and structure information in real time. The company says users remain in control of their data and can choose whether to share information with other apps.
Sandbar is also developing a feature called Inner Voice, which responds using a voice customized to the user. The feature will debut during a closed beta planned for this spring, giving the company time to refine how the software behaves in everyday use.
The startup currently employs a team of 15 people, many of whom have worked on well-known consumer devices including the iPhone, Fitbit, Kindle and Vision Pro. Recent hires include Sam Bowen, formerly of Amazon and Fitbit, who joined as vice president of hardware, and Brooke Travis, previously at Equinox, Dior and Gap, who now leads marketing.
Sandbar plans to begin shipping Stream in summer 2026 after completing early testing. As artificial intelligence tools become more integrated into daily life, the company is betting that the next shift in computing will not come from another app — but from new ways for people to interact with AI itself.