GitHub - keon/seq2seq: Minimal Seq2Seq model with Attention for Neural Machine Translation
source link: https://github.com/keon/seq2seq
mini seq2seq
Minimal Seq2Seq model with attention for neural machine translation in PyTorch.
This implementation focuses on the following features:
- Modular structure to be used in other projects
- Minimal code for readability
- Full utilization of batches and GPU
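The batching point deserves a concrete illustration. A common way to get full batch utilization with variable-length sentences in PyTorch is to pad the batch, record the true lengths, and pass a packed sequence through the GRU so padded positions are skipped. This is a minimal sketch of that general technique, not code from this repository; all tensor sizes here are made up for the example.

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical setup: vocab of 100 tokens, embedding dim 8, hidden dim 16.
embed = nn.Embedding(num_embeddings=100, embedding_dim=8).to(device)
gru = nn.GRU(input_size=8, hidden_size=16, batch_first=True).to(device)

# A batch of 3 sequences padded to length 5; `lengths` holds the true lengths
# (kept on CPU, as pack_padded_sequence requires, sorted for enforce_sorted=True).
tokens = torch.randint(1, 100, (3, 5)).to(device)   # (batch, time)
lengths = torch.tensor([5, 3, 2])

x = embed(tokens)
# Packing lets the GRU skip the padded timesteps entirely.
packed = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=True)
out_packed, h = gru(packed)
out, _ = pad_packed_sequence(out_packed, batch_first=True)  # (batch, time, hidden)
```

The same pattern works for the bidirectional encoder below; the decoder usually runs one step at a time instead, so it needs no packing.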
This implementation relies on torchtext to minimize dataset management and preprocessing parts.
Model description
- Encoder: Bidirectional GRU
- Decoder: GRU with Attention Mechanism
- Attention: Neural Machine Translation by Jointly Learning to Align and Translate
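The attention referenced above is the additive ("Bahdanau") variant from Jointly Learning to Align and Translate: the decoder state is scored against every encoder output through a small feed-forward network, and the softmax-weighted sum of encoder outputs becomes the context vector. The sketch below shows that mechanism in isolation; class and parameter names are my own choices, not identifiers from this repository, and with a bidirectional GRU encoder `enc_dim` would be twice the encoder's hidden size.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h_j) = v^T tanh(W_s s + W_h h_j)."""
    def __init__(self, dec_dim, enc_dim, attn_dim):
        super().__init__()
        self.W_s = nn.Linear(dec_dim, attn_dim, bias=False)  # projects decoder state
        self.W_h = nn.Linear(enc_dim, attn_dim, bias=False)  # projects encoder outputs
        self.v = nn.Linear(attn_dim, 1, bias=False)          # collapses to a scalar score

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(
            self.W_s(dec_state).unsqueeze(1) + self.W_h(enc_outputs)
        )).squeeze(-1)                           # (batch, src_len)
        weights = F.softmax(scores, dim=-1)      # alignment distribution over source
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights                  # (batch, enc_dim), (batch, src_len)

attn = BahdanauAttention(dec_dim=16, enc_dim=32, attn_dim=24)
ctx, w = attn(torch.randn(4, 16), torch.randn(4, 7, 32))
```

At each decoding step the context vector is concatenated with the decoder input (or state) before the next GRU step and the output projection.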
Requirements
- GPU & CUDA
- Python3
- PyTorch
- torchtext
- Spacy
- numpy
- Visdom (optional)
Download the spaCy tokenizer models:
python -m spacy download de
python -m spacy download en
References
Based on existing seq2seq implementations; see the repository for the linked references.