GitHub - CyberZHG/keras-radam: RAdam implemented in Keras
source link: https://github.com/CyberZHG/keras-radam
README.md
Keras RAdam
Unofficial implementation of RAdam.
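For context, RAdam (Rectified Adam) corrects the high variance of Adam's adaptive learning rate during the first training steps. The key quantities, as defined in the RAdam paper, are the maximum length of the approximated simple moving average and its per-step value:

```latex
\rho_\infty = \frac{2}{1-\beta_2} - 1, \qquad
\rho_t = \rho_\infty - \frac{2 t \beta_2^t}{1-\beta_2^t}
```

When \(\rho_t > 4\), the variance is tractable and the adaptive step is scaled by the rectification term

```latex
r_t = \sqrt{\frac{(\rho_t - 4)(\rho_t - 2)\,\rho_\infty}{(\rho_\infty - 4)(\rho_\infty - 2)\,\rho_t}};
```

otherwise the update falls back to SGD with momentum.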
Install
```bash
pip install keras-rectified-adam
```
Usage
```python
import keras
import numpy as np
from keras_radam import RAdam

# Build toy model with RAdam optimizer
model = keras.models.Sequential()
model.add(keras.layers.Dense(input_shape=(17,), units=3))
model.compile(RAdam(), loss='mse')

# Generate toy data
x = np.random.standard_normal((4096 * 30, 17))
w = np.random.standard_normal((17, 3))
y = np.dot(x, w)

# Fit
model.fit(x, y, epochs=5)
```
Use Warmup
```python
from keras_radam import RAdam

RAdam(total_steps=10000, warmup_proportion=0.1, min_lr=1e-5)
```
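As a rough sketch of what these parameters mean, the learning rate is typically warmed up linearly over the first `warmup_proportion` of `total_steps`, then decayed toward `min_lr` over the remainder. The helper below is an illustration of that schedule under those assumptions, not code taken from keras-radam itself:

```python
def warmup_lr(step, lr=1e-3, total_steps=10000,
              warmup_proportion=0.1, min_lr=1e-5):
    """Sketch of a linear warmup-then-decay schedule.

    Assumed to mirror the shape of keras-radam's warmup; the exact
    implementation in the library may differ.
    """
    warmup_steps = int(total_steps * warmup_proportion)
    if step < warmup_steps:
        # Linear warmup: 0 -> lr over the first warmup_steps
        return lr * step / warmup_steps
    # Linear decay: lr -> min_lr over the remaining steps
    decay_steps = total_steps - warmup_steps
    frac = (step - warmup_steps) / decay_steps
    return lr - (lr - min_lr) * frac
```

With the defaults above, the rate peaks at `lr` after 1,000 steps and reaches `min_lr` at step 10,000.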
Q & A
About Correctness
The optimizer produces losses and weights similar to those of the official implementation after 500 steps of training.
Use tf.keras or TensorFlow 2.0
Add `TF_KERAS=1` to your environment variables to use `tensorflow.python.keras`.
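One way to set the variable is from Python itself, as long as it happens before the library is imported. A minimal sketch (the commented-out import is where keras_radam would be loaded in a real script):

```python
import os

# The environment variable must be set before keras_radam is imported,
# since the backend is chosen at import time.
os.environ['TF_KERAS'] = '1'

# from keras_radam import RAdam  # now binds to tensorflow.python.keras
print(os.environ['TF_KERAS'])
```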
Use the Theano Backend
Add `KERAS_BACKEND=theano` to your environment variables to enable the Theano backend.