I want to compute a Jacobian matrix with TensorFlow.
Here is what I have so far:

    import tensorflow as tf

    def compute_grads(fn, vars, data_num):
        grads = []
        for n in range(0, data_num):      # loop over data points
            for v in vars:                # loop over trainable variables
                # gradient of the n-th row of fn with respect to v
                grads.append(tf.gradients(tf.slice(fn, [n, 0], [1, 1]), v)[0])
        # stack all per-(point, variable) gradients: one Jacobian row per data point
        return tf.reshape(tf.stack(grads), shape=[data_num, -1])

Here fn is the loss (it has one row per data point, which is why I slice out row n), vars is the list of all trainable variables, and data_num is the number of data points.

This works, but once I increase the number of data points, compute_grads takes a tremendous amount of time to run.
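In case it helps, here is a minimal, self-contained sketch of how I build the graph and call compute_grads. The toy linear model and the names X, y, and W are purely illustrative stand-ins for my actual setup:

    import numpy as np
    import tensorflow as tf

    data_num = 100                                # graph construction slows down badly as this grows
    X = tf.placeholder(tf.float32, shape=[data_num, 3])
    y = tf.placeholder(tf.float32, shape=[data_num, 1])

    W = tf.Variable(tf.random_normal([3, 1]))     # single trainable variable in this toy example
    pred = tf.matmul(X, W)
    fn = tf.square(pred - y)                      # loss, one value per data point, shape [data_num, 1]
    vars = tf.trainable_variables()               # here just [W]

    # this builds data_num * len(vars) separate gradient ops in the graph
    jacobian = compute_grads(fn, vars, data_num)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        J = sess.run(jacobian, feed_dict={
            X: np.random.rand(data_num, 3).astype(np.float32),
            y: np.random.rand(data_num, 1).astype(np.float32)})
        print(J.shape)                            # (100, 3): one Jacobian row per data point
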
Any ideas?