A GAN-based face reenactment technique for non-human game characters using a coloring module
Facial Expression Reenactment, Game Character, Coloring Module, GAN
Hyoungbum KIM, Kyungha MIN, Heekyung YANG
Recently, much progress on deep learning-based facial expression reenactment has been reported. Applying facial reenactment techniques to game characters, however, requires addressing the challenge of extracting action units (AUs) from non-human faces. To extract AUs from non-human game characters effectively, we devise a coloring module that adjusts the face colors of a character to those of a typical human face. Our GAN-based framework consists of a coloring module, two generators, two discriminators, and an identity-preserving module. First, we adjust the face color using the coloring module and then generate a neutral face with the first generator according to the input AU. The second generator produces reenacted face images from the neutral face. Finally, the coloring module restores the original face color to the reenacted facial images. We measure the quality of the generated images through the discriminators and preserve identity through the identity-preserving module. We generate reenacted images from various game characters and demonstrate the superiority of our method.
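As a rough illustration of what such a coloring step could look like, the sketch below implements channel-wise color-statistics matching (Reinhard-style transfer). This is only an assumption about the module's mechanism, not the paper's actual formulation; the function name and the mean/std matching strategy are hypothetical stand-ins. Applied forward, it maps a character's face colors toward human skin-tone statistics before AU extraction; swapping the arguments restores the original palette after reenactment.

```python
import numpy as np

def color_transfer(src: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Shift each color channel of `src` so its mean and standard
    deviation match those of `ref`.

    Hypothetical stand-in for a coloring module: `src` is the
    character face image, `ref` a human reference face. Both are
    H x W x 3 uint8 arrays.
    """
    src_f = src.astype(np.float64)
    ref_f = ref.astype(np.float64)
    out = np.empty_like(src_f)
    for c in range(src_f.shape[-1]):
        s_mu, s_sd = src_f[..., c].mean(), src_f[..., c].std() + 1e-8
        r_mu, r_sd = ref_f[..., c].mean(), ref_f[..., c].std()
        # Normalize the source channel, then rescale to the
        # reference channel's statistics.
        out[..., c] = (src_f[..., c] - s_mu) / s_sd * r_sd + r_mu
    return np.clip(out, 0, 255).astype(np.uint8)
```

Inverting the mapping after reenactment is then simply `color_transfer(reenacted, original_character_face)`, which re-applies the character's original color statistics.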