source link: https://github.com/albertpumarola/GANimation
# GANimation: Anatomically-aware Facial Animation from a Single Image
Official implementation of GANimation. In this work we introduce a novel GAN conditioning scheme based on Action Unit (AU) annotations, which describe in a continuous manifold the anatomical facial movements defining a human expression. Our approach permits controlling the magnitude of activation of each AU and combining several of them. For more information please refer to the paper.
## Prerequisites

- Install PyTorch, Torchvision and dependencies from http://pytorch.org
- Install the remaining requirements with `pip install -r requirements.txt`
## Data Preparation

The code requires a directory containing the following files:

- `imgs/`: folder with all images.
- `aus_openface.pkl`: dictionary containing the action units of each image.
- `train_ids.csv`: file containing the names of the images used for training.
- `test_ids.csv`: file containing the names of the images used for testing.

An example of this directory is shown in `sample_dataset/`.
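The expected layout can be sanity-checked before training. The following is an illustrative sketch (not part of the repository); the entry names are taken from the list above:

```python
from pathlib import Path

# Entries the README says a dataset directory must contain.
REQUIRED = ["imgs", "aus_openface.pkl", "train_ids.csv", "test_ids.csv"]

def missing_entries(root):
    """Return the required entries that are absent from a dataset directory."""
    root = Path(root)
    return [name for name in REQUIRED if not (root / name).exists()]
```

Running `missing_entries("sample_dataset")` against the bundled example should return an empty list.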
To generate `aus_openface.pkl`, extract the Action Units of each image with OpenFace and store each output in a CSV file with the same name as the image. Then run:

```bash
python data/prepare_au_annotations.py
```
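Conceptually, this step collects the per-image OpenFace CSV outputs into a single dictionary mapping image names to AU intensity vectors. Here is a minimal sketch of that aggregation, assuming OpenFace's standard `AU??_r` intensity columns; the function name is hypothetical and this is not the repository's actual script:

```python
import csv
import pickle
from pathlib import Path

def collect_au_annotations(csv_dir, out_path):
    """Sketch: build {image_name: [AU intensities]} from per-image
    OpenFace CSVs and pickle the result (cf. aus_openface.pkl)."""
    annotations = {}
    for csv_file in Path(csv_dir).glob("*.csv"):
        with open(csv_file, newline="") as f:
            row = next(csv.DictReader(f))  # first (only) frame per image
        # Keep only the AU intensity columns (e.g. 'AU01_r'), in sorted order.
        aus = [float(row[k]) for k in sorted(row)
               if k.strip().startswith("AU") and k.strip().endswith("_r")]
        annotations[csv_file.stem] = aus
    with open(out_path, "wb") as f:
        pickle.dump(annotations, f)
    return annotations
```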
## Run

To train:

```bash
bash launch/run_train.sh
```

To test:

```bash
python test.py --input_path path/to/img
```
## Citation

If you use this code or ideas from the paper for your research, please cite our paper:

```
@inproceedings{pumarola2018ganimation,
  title={{GANimation: Anatomically-aware Facial Animation from a Single Image}},
  author={A. Pumarola and A. Agudo and A.M. Martinez and A. Sanfeliu and F. Moreno-Noguer},
  booktitle={ECCV},
  year={2018}
}
```