

GitHub - brain-research/self-attention-gan
source link: https://github.com/brain-research/self-attention-gan

Self-Attention GAN
TensorFlow implementation for reproducing the main results in the paper Self-Attention Generative Adversarial Networks by Han Zhang, Ian Goodfellow, Dimitris Metaxas, and Augustus Odena.
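The paper's core contribution is a self-attention block inserted into both the generator and the discriminator, so that each spatial position can attend to every other position in the feature map. For orientation only, here is a minimal TensorFlow 1.x sketch of the mechanism as described in the paper; it is not this repository's exact code (which, among other things, also applies spectral normalization):

```python
import tensorflow as tf

def self_attention(x, scope="self_attention"):
    """Minimal SAGAN-style self-attention block (a sketch, not the repo's code).

    x: float tensor of shape [batch, height, width, channels] with static
    spatial and channel dimensions, channels >= 8.
    """
    with tf.variable_scope(scope):
        _, h, w, c = x.get_shape().as_list()
        # 1x1 convolutions produce the query (f), key (g), and value (v) maps.
        f = tf.layers.conv2d(x, c // 8, 1, name="f_conv")
        g = tf.layers.conv2d(x, c // 8, 1, name="g_conv")
        v = tf.layers.conv2d(x, c, 1, name="v_conv")
        # Flatten the spatial grid into a sequence of h*w locations.
        f = tf.reshape(f, [-1, h * w, c // 8])
        g = tf.reshape(g, [-1, h * w, c // 8])
        v = tf.reshape(v, [-1, h * w, c])
        # Attention weights over all locations for each query position.
        attn = tf.nn.softmax(tf.matmul(f, g, transpose_b=True))  # [B, hw, hw]
        o = tf.reshape(tf.matmul(attn, v), [-1, h, w, c])
        # Learnable residual gate, initialized to zero so training starts
        # from the plain convolutional behavior and eases attention in.
        gamma = tf.get_variable("gamma", shape=[],
                                initializer=tf.zeros_initializer())
        return x + gamma * o
```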
Dependencies
Python 3.6
TensorFlow 1.5
Data
Download the ImageNet dataset and preprocess the images into TFRecord files as instructed in improved-gan. Put the TFRecord files into ./data
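The record schema is defined by the improved-gan preprocessing scripts, so it is not assumed here. As a quick sanity check that the files landed in ./data and are readable, you can count records with stock TF 1.x I/O (the .tfrecord extension in the glob is an assumption; adjust it to match what the preprocessing actually produces):

```python
import tensorflow as tf

# Sanity-check the preprocessed data by counting records in ./data.
# The "*.tfrecord" pattern is an assumption; change it to match the
# filenames the improved-gan preprocessing scripts actually emit.
files = tf.gfile.Glob("./data/*.tfrecord")
count = 0
for path in files:
    for _ in tf.python_io.tf_record_iterator(path):
        count += 1
print("found %d records across %d files" % (count, len(files)))
```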
Training
The current batch size is 64x4=256 (64 per GPU across the 4 GPUs below). A larger batch size seems to give better performance, but it may require finding new hyperparameters for the generator and discriminator learning rates. Note: it usually takes several weeks to train for one million steps.
CUDA_VISIBLE_DEVICES=0,1,2,3 python train_imagenet.py --generator_type test --discriminator_type test --data_dir ./data
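For context on retuning the G and D learning rates: the paper trains the two networks at different rates (a two-timescale update rule, with Adam at learning rate 1e-4 for the generator and 4e-4 for the discriminator, beta1=0, beta2=0.9). A hedged sketch of that setup follows; the actual flags and defaults in train_imagenet.py may differ:

```python
import tensorflow as tf

# Two-timescale update rule from the paper: the discriminator learns
# faster than the generator. These are the paper's defaults; the flags
# and defaults used by train_imagenet.py may differ.
g_opt = tf.train.AdamOptimizer(learning_rate=1e-4, beta1=0.0, beta2=0.9)
d_opt = tf.train.AdamOptimizer(learning_rate=4e-4, beta1=0.0, beta2=0.9)

# In the training graph these would minimize the generator and
# discriminator losses over their respective variable lists, e.g.:
#   g_train_op = g_opt.minimize(g_loss, var_list=g_vars)
#   d_train_op = d_opt.minimize(d_loss, var_list=d_vars)
```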
Evaluation
CUDA_VISIBLE_DEVICES=4 python eval_imagenet.py --generator_type test --data_dir ./data
Citing Self-Attention GAN
If you find Self-Attention GAN useful in your research, please consider citing:
@article{Han18,
  author  = {Han Zhang and
             Ian J. Goodfellow and
             Dimitris N. Metaxas and
             Augustus Odena},
  title   = {Self-Attention Generative Adversarial Networks},
  year    = {2018},
  journal = {arXiv:1805.08318},
}