Richard Chaney
Dublin-based production company Piranha Bar has developed AniMotion, the first performance capture technology that enables an actor’s facial expressions and body movements to be translated and rendered in real time as a digital character. The tech was debuted by Juventus at the "Team Jay" stand at Lucca Junior in Italy, as part of the Lucca Comics and Games festival. Richard Chaney, Creative Director at the company, talked exclusively with Señal News about the advantages of this new technology.
How does AniMotion work in real time?
"We use the power and versatility of Unreal Engine to bring together our 'Team Jay' characters and various well-known 'Team Jay' environments, with the performance capture hardware and software we collectively call AniMotion. The result: real-time, live renders with zero lag. Our characters look, sound, and perform just like the animated characters that the fans know and love—but now, they can talk to and interact with them in real time. Bespoke Unreal Engine character rigs take on motion data in real-time from the sensor-less motion capture suit, facial rig, and gloves. The character’s motion is rendered—still in real time—without any loss of quality or production value compared to what fans are used to seeing in the show."
What equipment is required to implement this technology, and how is it used?
"Our AniMotion Live team travels light. Apart from needing some serious computing power on-site, an overnight bag of gear is all it takes: a suit, headgear, gloves, and sensors, and we’re good to go. Oh and a small handful of talented and knowledgeable artists and tech-heads. But all this is only possible thanks to over a decade of R&D we’ve put into the evolution of performance capture technology and the ongoing innovations we now call AniMotion Live.
In what ways can audiences engage with the characters, and what are some interesting ways different IPs could use the technology?
"We see myriad possibilities for AniMotion Live to bring characters closer to their fans. The live event we did at for 'Team Jay' at Lucca Comics & Games in November 2024 demonstrated the power of live interactions—we witnessed the moment the penny dropped for kids (and their parents) as they realised that this character they know and love, previously remote and detached inside their devices, was now speaking directly to them. They asked him questions, he answered them, they showed him a dance move, and he copied them. It was beautiful to see. We also see potential in a wide range of live interactive contexts. Digital characters—whether 3D or 2D, humanoid or stylised—can do what any real human character can: have a live online presence, do live social posts or streams on YouTube or Twitch, and become virtual influencers in their own right. They can be interviewed live on video streams or TV at a moment’s notice, do promotional tours for their movies and series, and even commentate on sports broadcasts or become respected pundits. There’s also massive potential for brands to embrace this tech to create forms of entertainment that powerfully and immediately engage audiences. Animated characters can form long-term relationships with their consumers—they don’t age, and they don’t throw tantrums."
Why does this technology enable animation to become more tactical, topical, and reactive?
"Once the investment is made to make a digital character AniMotion Live-compatible and all the setup is complete, you now have an animated property that’s ready to interact with the world around them. It’s key to have writers and actors prepared to come up with smart, topical, and reactionary ideas and content that plugs into emerging trends, memes, current affairs, or sports events—or communicates tactically in branded entertainment contexts."
How can this be achieved with animation, despite the medium’s reputation as a laborious and time-consuming form of entertainment?
"AniMotion is not only for live content. We often use AniMotion to create animation with a much faster turnaround than traditional linear animation (which remains king when it comes to our linear episodes). We subscribe to a 'form follows function' approach, where quick-turnaround content—whose value lies in its immediacy or interactivity—embraces the flow and agility of this technology."
How is this done without sacrificing quality and high production values?
"We don’t believe it’s a question of downgrading quality—whether visually, in writing, or in performance. It’s more about being smart with durations, complexity, number of characters, props, etc. Yes, the nuances of the performer driving the character may differ slightly from the painstaking beauty of linear animation. But being strategic about how much of the character is shown, the complexity of their choreography, and how many characters they interact with can mitigate these compromises. In terms of render quality and production values, there needn’t be any difference between the rendered look of the show and how the characters appear in real time. However, smart containment of the environment’s extent and complexity makes a big difference—again, form following function."