Expressive Character Animation

To create a successful animated story, a character's emotional state must be staged so that it is clear and unmistakable, and the viewer's perception of the character's facial expression is key to staging that emotion successfully. Traditionally, animators and automatic expression-transfer systems rely on geometric markers and features modeled on human faces to create character expressions, yet these features do not transfer accurately to stylized character faces. Relying on human geometric features alone to generate stylized character expressions produces expressions that are perceptually confusing or that differ from the intended emotion. Our framework avoids these pitfalls by learning to transfer human facial expressions to character expressions that are both perceptually consistent and geometrically correct.
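As a hedged illustration of the perceptual-consistency idea only (not the actual pipeline from our papers), the Python sketch below assumes a placeholder expression classifier trained on stylized character faces and uses it to check whether a transferred expression is perceived as the intended emotion. The emotion label set, the tiny CNN, and all function names are assumptions introduced for this example.

# Illustrative sketch (not the papers' implementation): verify that a
# transferred character expression is *perceived* as the intended emotion
# by running it through an expression classifier for character faces.
# The untrained placeholder CNN below stands in for such a model.
import torch
import torch.nn as nn

EMOTIONS = ["anger", "disgust", "fear", "joy", "neutral", "sadness", "surprise"]

class TinyExpressionCNN(nn.Module):
    """Placeholder CNN mapping a 64x64 grayscale face to 7 emotion scores."""
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

def is_perceptually_consistent(model: nn.Module, face: torch.Tensor,
                               intended: str) -> bool:
    """True if the classifier perceives `face` as the intended emotion."""
    with torch.no_grad():
        predicted = EMOTIONS[model(face.unsqueeze(0)).argmax(dim=1).item()]
    return predicted == intended

model = TinyExpressionCNN().eval()
rendered_face = torch.rand(1, 64, 64)  # stand-in for a rendered character face
print(is_perceptually_consistent(model, rendered_face, "joy"))

In practice such a check would use a classifier trained on annotated character expressions (for example, images like those in FERG-DB below) rather than the random-weight placeholder shown here.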

Publications

[2]  Learning to Generate 3D Stylized Character Expressions from Humans
      Deepali Aneja, Bindita Chaudhari, Alex Colburn, Gary Faigin, Linda G. Shapiro, Barbara Mones
      WACV 2018
[1]  Modeling Stylized Character Expressions via Deep Learning
      Deepali Aneja, Alex Colburn, Gary Faigin, Linda G. Shapiro, Barbara Mones
      ACCV 2016 [Oral]

Posters

[1]  Learning Stylized Character Expressions from Humans
      Deepali Aneja, Alex Colburn, Gary Faigin, Linda G. Shapiro, Barbara Mones
      WiCV, CVPR 2017

Facial Expression Research Group Database (FERG-DB)

FERG-DB is a database of stylized characters with annotated facial expressions. It contains 55,767 annotated face images of six stylized characters (aia, bonnie, jules, malcolm, mery, and ray). The characters were modeled in Autodesk Maya and rendered to 2D images.
Download the dataset!
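
For reference, here is a minimal Python sketch for indexing the downloaded images. The directory layout assumed below (<root>/<character>/<expression>/*.png) and the root folder name are assumptions; adjust the paths to match the structure of the archive you download.

# Minimal indexing sketch for FERG-DB (layout is an assumption -- adapt
# the directory traversal to the actual structure of the download).
from pathlib import Path
from typing import List, Tuple

CHARACTERS = ["aia", "bonnie", "jules", "malcolm", "mery", "ray"]

def index_ferg(root: str) -> List[Tuple[str, str, Path]]:
    """Return (character, expression_label, image_path) triples."""
    samples = []
    for char_dir in sorted(Path(root).iterdir()):
        if not char_dir.is_dir() or char_dir.name not in CHARACTERS:
            continue
        for expr_dir in sorted(p for p in char_dir.iterdir() if p.is_dir()):
            for img in sorted(expr_dir.glob("*.png")):
                samples.append((char_dir.name, expr_dir.name, img))
    return samples

if __name__ == "__main__":
    samples = index_ferg("FERG_DB")  # hypothetical root folder name
    print(f"Indexed {len(samples)} images")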

People

If you have any questions, or just want to say hi, email us!