I am trying to add an extra parameter to a Keras deep learning architecture, one whose value changes at every epoch (or after a given number of epochs).
Assume that in the new architecture (CNN, RNN, etc.) a parameter 'alpha1' is added and initialized with some value, for example 16.
Then, during training, I want to update alpha1 at every epoch, say alpha1 = alpha1 * somevalue.
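For concreteness, here is a minimal sketch (in tf.keras) of what I mean by adding such a parameter. The layer name AlphaLayer and the way alpha1 is used are just illustrative assumptions, not my actual architecture:

```python
import tensorflow as tf

# Illustrative sketch only: 'AlphaLayer' is a hypothetical layer that holds
# an extra scalar parameter 'alpha1', initialized to 16.
class AlphaLayer(tf.keras.layers.Layer):
    def __init__(self, initial_alpha=16.0, **kwargs):
        super().__init__(**kwargs)
        self.initial_alpha = initial_alpha

    def build(self, input_shape):
        # alpha1 is stored as a non-trainable weight, so it is not changed by
        # gradient descent and can be read or overwritten from outside.
        self.alpha1 = self.add_weight(
            name="alpha1",
            shape=(),
            initializer=tf.keras.initializers.Constant(self.initial_alpha),
            trainable=False,
        )

    def call(self, inputs):
        # Use alpha1 in whatever computation the architecture needs;
        # scaling the inputs is just a placeholder here.
        return inputs * self.alpha1
```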
Since the step() function in keras/../recurrent.py, where the computations are done, is only called once (not at every epoch), I could not add the parameter update there.
Is there any way to update a parameter in a Keras model during training?
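To make the question concrete, this is roughly the kind of per-epoch update I have in mind. A callback is only my guess at a possible hook point; the class name AlphaUpdater, the layer name "alpha_layer", and somevalue are placeholders, not part of any existing API:

```python
import tensorflow as tf

# Rough sketch of the desired behaviour: multiply alpha1 by somevalue
# at the end of every epoch.
class AlphaUpdater(tf.keras.callbacks.Callback):
    def __init__(self, somevalue=0.5):
        super().__init__()
        self.somevalue = somevalue

    def on_epoch_end(self, epoch, logs=None):
        layer = self.model.get_layer("alpha_layer")
        old = tf.keras.backend.get_value(layer.alpha1)
        # alpha1 = alpha1 * somevalue
        tf.keras.backend.set_value(layer.alpha1, old * self.somevalue)

# Intended usage (placeholder data/model):
# model.fit(x, y, epochs=10, callbacks=[AlphaUpdater(somevalue=0.5)])
```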
question from:
https://stackoverflow.com/questions/65909353/how-to-update-a-parameter-i-e-dropout-rate-or-units-at-each-epoch-within-a-ke