Source: https://github.com/openai/gpt-2
README.md
gpt-2
Code and samples from the paper "Language Models are Unsupervised Multitask Learners".
For now, we have only released a smaller (117M parameter) version of GPT-2.
See more details in our blog post.
Installation
Download the model data (requires `gsutil`):

```sh
sh download_model.sh 117M
```
Install Python packages:

```sh
pip3 install -r requirements.txt
```
Unconditional sample generation
WARNING: Samples are unfiltered and may contain offensive content.

To generate unconditional samples from the small model:

```sh
python3 src/generate_unconditional_samples.py | tee samples
```
There are various flags for controlling the samples:

```sh
python3 src/generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee samples
```
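These flags correspond to standard sampling techniques: `temperature` rescales the logits before the softmax (lower values make the distribution sharper), and `top_k` restricts sampling to the k most likely tokens. The sketch below is a hypothetical NumPy re-implementation of that combination, not the repo's actual sampler:

```python
import numpy as np

def sample_logits(logits, top_k=40, temperature=0.7, rng=None):
    """Sample one token id from raw logits using temperature scaling
    and top-k truncation (illustrative only; the repo's real sampler
    is TensorFlow code under src/)."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        # Mask everything outside the k highest-scoring tokens.
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)
    # Softmax over the (possibly truncated) logits.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

With `top_k=1` this degenerates to greedy decoding; with `top_k=0` (no truncation) and `temperature=1` it matches the default settings mentioned below.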
While we have not yet released GPT-2 itself, you can see some unconditional samples from it (with the default settings of temperature 1 and no truncation) in gpt2-samples.txt.
Conditional sample generation
To give the model custom prompts, you can use:

```sh
python3 src/interactive_conditional_samples.py
```
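Conditional generation seeds the model's context with your prompt and then samples one token at a time, appending each token to the context before predicting the next. A minimal sketch of that loop, with the model, encoder, and decoder all stubbed out as hypothetical callables (the real script uses TensorFlow and the released BPE encoder):

```python
def generate(model_step, encode, decode, prompt, length=20):
    """Autoregressive generation sketch.
    model_step(context) -> logits over the vocabulary for the next token.
    encode/decode map between text and token-id lists (hypothetical here).
    """
    context = encode(prompt)
    for _ in range(length):
        logits = model_step(context)
        # Greedy argmax for a deterministic sketch; the repo samples
        # from the distribution (see the top_k/temperature flags above).
        context.append(max(range(len(logits)), key=logits.__getitem__))
    return decode(context)
```

The key point is that the prompt and the generated continuation share one growing context, which is why longer prompts steer the output.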
Future work
We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.