Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


0 votes
676 views
in Technique by (71.8m points)

python - Tensorflow Probability - negative log likelihood

I am training a Multivariate Normal from TFP using a probabilistic model as shown below:

import tensorflow as tf
import tensorflow_probability as tfp
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.optimizers import Adam

tfpl = tfp.layers

def NLL(y_true, y_pred):
    # y_pred is a tfp.distributions.Distribution instance, so we can
    # score y_true directly under the predicted distribution
    return -y_pred.log_prob(y_true)

output_dim = 250
model = Sequential([
    Input(shape=(5,)),
    # one Dense layer emits the loc and lower-triangular scale parameters
    Dense(tfpl.MultivariateNormalTriL.params_size(output_dim)),
    tfpl.MultivariateNormalTriL(output_dim)
])
model.compile(loss=NLL, optimizer=Adam(learning_rate=1e-3, clipvalue=0.25))

Training works with Adam and clipvalue=0.25. However, I still sometimes get very high loss values while optimizing the model. Is there any useful trick to gain more numerical stability?
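For intuition about where the spikes come from, here is a minimal NumPy sketch (an illustration added here, not from the thread; `mvn_nll` is a hypothetical helper, not part of the model above). The multivariate normal log-density contains a log-determinant term driven by the Cholesky diagonal, so the NLL blows up as any diagonal entry of the learned scale factor collapses toward zero:

```python
import numpy as np

def mvn_nll(y, mu, L):
    # Negative log-likelihood of a multivariate normal N(mu, L @ L.T),
    # parameterized by its lower-triangular Cholesky factor L
    d = len(y)
    z = np.linalg.solve(L, y - mu)              # whitened residual
    log_det = 2.0 * np.sum(np.log(np.diag(L)))  # log|Sigma| from the Cholesky diagonal
    return 0.5 * (z @ z + log_det + d * np.log(2.0 * np.pi))

y = np.array([0.1, -0.2])
mu = np.zeros(2)
well_scaled = np.diag([1.0, 1.0])
near_singular = np.diag([1e-6, 1.0])  # one Cholesky diagonal entry collapsing

print(mvn_nll(y, mu, well_scaled))
print(mvn_nll(y, mu, near_singular))  # orders of magnitude larger
```

This is why, besides gradient clipping as above, commonly suggested remedies include standardizing the targets and keeping a small positive shift on the scale diagonal (the approach taken by TFP's `tfb.FillScaleTriL` bijector).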

Any help is highly appreciated. Thanks!

question from:https://stackoverflow.com/questions/65904164/tensorflow-probability-negative-log-likelihood


1 Reply

0 votes
by (71.8m points)
Waiting for answers
