Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


0 votes
398 views
in Technique by (71.8m points)

tensorflow - Tensorflow2 learning rate scheduler

I have implemented a custom training loop for my LSTM model:

with tf.GradientTape() as tape:
    y_pred = model(x_batch, training=True)   
    loss_value = loss_fn(y_true=y_batch, y_pred=y_pred)
# calculate gradients
gradients = tape.gradient(loss_value, model.trainable_variables) 
# clip the gradients
gradients, _ = tf.clip_by_global_norm(gradients, 5.0)
# update weights
optimizer.apply_gradients(zip(gradients, model.trainable_variables))
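For reference, the loop above runs end to end once the surrounding objects are defined; here is a self-contained single-step sketch, where the toy model, loss, and random batch are assumptions for illustration and not taken from the question:

```python
import tensorflow as tf

# Hypothetical stand-ins: a tiny LSTM model, MSE loss, and one random batch.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(5, 3)),  # 5 timesteps, 3 features
    tf.keras.layers.LSTM(8),
    tf.keras.layers.Dense(1),
])
loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.Adam()

x_batch = tf.random.normal((4, 5, 3))
y_batch = tf.random.normal((4, 1))

# One training step, mirroring the loop in the question.
with tf.GradientTape() as tape:
    y_pred = model(x_batch, training=True)
    loss_value = loss_fn(y_true=y_batch, y_pred=y_pred)
gradients = tape.gradient(loss_value, model.trainable_variables)
gradients, _ = tf.clip_by_global_norm(gradients, 5.0)
optimizer.apply_gradients(zip(gradients, model.trainable_variables))
```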

and I'm trying to use a learning rate scheduler:

boundaries = [100000, 110000]
lr_values = [1.0, 0.5, 0.1]
tf.keras.optimizers.schedules.PiecewiseConstantDecay(boundaries, lr_values)
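For reference, a `PiecewiseConstantDecay` schedule is normally attached by passing the schedule object as the optimizer's `learning_rate` argument, so the optimizer evaluates it at its own step counter on every `apply_gradients` call. A minimal sketch (the choice of Adam here is an assumption, not taken from the question):

```python
import tensorflow as tf

# lr = 1.0 up to step 100000, 0.5 up to step 110000, then 0.1.
boundaries = [100000, 110000]
lr_values = [1.0, 0.5, 0.1]
lr_schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries, lr_values)

# Pass the schedule object itself, not a call result, as learning_rate.
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
```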

However, I get this error message when calling optimizer.apply_gradients(zip(gradients, model.trainable_variables)):

TypeError: To be compatible with tf.contrib.eager.defun, Python functions must return zero or more Tensors; in compilation of <function train_model at 0x2ba5bde90dc0>, found return value of type <class 'tensorflow.python.keras.optimizer_v2.adam.Adam'>, which is not a Tensor.
question from: https://stackoverflow.com/questions/65936476/tensorflow2-learning-rate-schedular


1 Reply

0 votes
by (71.8m points)
Waiting for answers

