Generative Animation and Emotes with Yassine Tahi, Kinetix CEO
One of the fastest-growing areas of creative expression in games has been the emotes and animations you can apply to your character in games like Fortnite. Yassine Tahi saw this realm of self-expression coming and created a platform that democratizes the creation of character animations.
With Kinetix, individual creators can capture movement and make their avatar animations available through a marketplace—and end-users can invoke them through text-to-animation generative prompting. This requires solving a number of challenging technology problems:
Motion capture with a normal camera that maps to the character rig
Adjusting animation to the geometry and physics of an environment
Labeling the data in a way that generative AI technology can make use of it
Making animations interoperable across different games, virtual worlds, and metaverse platforms
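To make the first problem above more concrete, mapping captured motion onto a character rig typically involves retargeting per-joint data from the capture skeleton's joint names onto the rig's. Here is a minimal sketch of that renaming step; the joint names, the Euler-angle pose format, and the function are illustrative assumptions, not Kinetix's actual pipeline or API:

```python
# Illustrative joint-name mapping from a hypothetical capture skeleton
# to a hypothetical character rig (not Kinetix's real naming scheme).
SOURCE_TO_TARGET = {
    "LeftUpLeg": "thigh_l",
    "LeftLeg": "calf_l",
    "LeftFoot": "foot_l",
}

def retarget_rotations(source_pose, joint_map):
    """Copy per-joint rotations (here, Euler-angle tuples in degrees)
    from the captured skeleton onto the target rig, renaming joints
    via joint_map. Captured joints with no rig equivalent are dropped."""
    return {joint_map[joint]: rotation
            for joint, rotation in source_pose.items()
            if joint in joint_map}

if __name__ == "__main__":
    # A single captured frame; "Spine" has no mapping, so it is ignored.
    captured = {
        "LeftUpLeg": (10.0, 0.0, 5.0),
        "LeftLeg": (45.0, 0.0, 0.0),
        "Spine": (2.0, 0.0, 0.0),
    }
    print(retarget_rotations(captured, SOURCE_TO_TARGET))
```

A production retargeter would also account for differing bone lengths, joint orientations, and rest poses, which is part of what makes the problem hard.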
If you prefer audio, you can also listen to the conversation with Yassine Tahi here (although I think this is a good one to watch: some of the animation examples we included are worth viewing!).
Show Notes
Yassine’s company can be found at Kinetix.tech.
Kinetix is currently partnering with some of these platforms and products:
Inworld — character AI platform
Zepeto — avatar-centric gaming and interactive platform
The Sandbox — voxel-based metaverse platform
NextDancer — game with dancing movement at its core
One of the related research papers Jon brought up was PADL (Language-Directed Physics-Based Character Control). Watch the animation examples from PADL in this video: