At the beginning of my code (outside the scope of a Session), I've set my random seeds -
import numpy as np
import tensorflow as tf

np.random.seed(1)
tf.set_random_seed(1)
This is what my dropout definition looks like -
cell = tf.nn.rnn_cell.DropoutWrapper(cell, output_keep_prob=args.keep_prob, seed=1)
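For context, a minimal sketch of the surrounding setup; the BasicLSTMCell and its size are placeholders, not my actual model -

import tensorflow as tf

tf.set_random_seed(1)  # graph-level seed, set before building the graph

# Hypothetical cell; the real model's cell type and sizes aren't shown here.
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=128)
cell = tf.nn.rnn_cell.DropoutWrapper(cell, output_keep_prob=0.8, seed=1)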
In my first experiment, I kept keep_prob=1. All results obtained were deterministic. I'm running this on a multicore CPU.
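For reference, a sketch of pinning the session to a single thread (assuming TF 1.x and its ConfigProto options), in case op scheduling on a multicore CPU is relevant -

import tensorflow as tf

# Restrict both intra-op and inter-op parallelism to one thread,
# so the execution order of ops is fixed across runs.
config = tf.ConfigProto(intra_op_parallelism_threads=1,
                        inter_op_parallelism_threads=1)
sess = tf.Session(config=config)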
In my second experiment, I set keep_prob=0.8 and ran the same code twice. Each run executed these statements -
sess.run(model.cost, feed)
sess.run(model.cost, feed)
Results from the first run -
(Pdb) sess.run(model.cost, feed)
4.9555049
(Pdb) sess.run(model.cost, feed)
4.9548969
This is expected behaviour, since DropoutWrapper uses random_uniform internally, and each sess.run advances that op's internal RNG state, so consecutive calls draw different dropout masks.
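To illustrate, a minimal sketch showing the same within-run behaviour with a bare random_uniform op (both seeds set, yet consecutive run calls still yield different draws) -

import tensorflow as tf

tf.set_random_seed(1)               # graph-level seed
x = tf.random_uniform([1], seed=1)  # op-level seed

with tf.Session() as sess:
    print(sess.run(x))  # first draw
    print(sess.run(x))  # a different value: the op advances its internal state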
Results from the second run -
(Pdb) sess.run(model.cost, feed)
4.9551616
(Pdb) sess.run(model.cost, feed)
4.9552417
Why is this sequence not identical to the first run's output, even though I have defined both an operation-level and a graph-level seed?