python - TensorFlow: Using the Adam optimizer

I am experimenting with some simple models in TensorFlow, including one that looks very similar to the first MNIST for ML Beginners example, but with somewhat larger dimensionality. I can use the gradient descent optimizer with no problems, getting good enough convergence. When I try to use the Adam optimizer, I get errors like this:

tensorflow.python.framework.errors.FailedPreconditionError: Attempting to use uninitialized value Variable_21/Adam
     [[Node: Adam_2/update_Variable_21/ApplyAdam = ApplyAdam[T=DT_FLOAT, use_locking=false, _device="/job:localhost/replica:0/task:0/cpu:0"](Variable_21, Variable_21/Adam, Variable_21/Adam_1, beta1_power_2, beta2_power_2, Adam_2/learning_rate, Adam_2/beta1, Adam_2/beta2, Adam_2/epsilon, gradients_11/add_10_grad/tuple/control_dependency_1)]]

where the specific variable reported as uninitialized changes from run to run. What does this error mean, and what does it suggest is wrong? It occurs regardless of the learning rate I use.


1 Reply


The AdamOptimizer class creates additional variables, called "slots", to hold values for the "m" and "v" accumulators.

See the source here if you're curious; it's actually quite readable: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/adam.py#L39. Other optimizers, such as Momentum and Adagrad, use slots too.
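
If you're curious what gets created, you can ask the optimizer for its slots after calling minimize(). A minimal sketch (the tiny model here is a hypothetical placeholder, just so Adam has variables to work with):

import tensorflow as tf

# Hypothetical two-variable model, just so the optimizer has something to optimize.
w = tf.Variable(tf.zeros([784, 10]), name="weights")
b = tf.Variable(tf.zeros([10]), name="bias")
loss = tf.reduce_sum(tf.square(w)) + tf.reduce_sum(tf.square(b))

opt = tf.train.AdamOptimizer(1e-4)
train_op = opt.minimize(loss)

print(opt.get_slot_names())  # ['m', 'v']
for var in tf.trainable_variables():
    # Each trainable variable gets its own "m" and "v" slot variables.
    print(var.op.name, opt.get_slot(var, "m").op.name, opt.get_slot(var, "v").op.name)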

These variables must be initialized before you can train a model.

The normal way to initialize variables is to call tf.initialize_all_variables(), which adds an op that initializes all of the variables present in the graph at the time it is called.

(Aside: contrary to what its name suggests, initialize_all_variables() does not initialize anything itself; it only adds an op that will initialize the variables when it is run.)
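
You can see this for yourself with tf.report_uninitialized_variables(), which returns the names of variables that have not yet been initialized. (A sketch; this function exists in TF 1.x releases but may be absent from very old versions.)

import tensorflow as tf

v = tf.Variable(tf.zeros([10]), name="v")
init_op = tf.initialize_all_variables()  # only builds an op; nothing runs here

sess = tf.Session()
print(sess.run(tf.report_uninitialized_variables()))  # [b'v'] - still uninitialized
sess.run(init_op)                                     # running the op initializes v
print(sess.run(tf.report_uninitialized_variables()))  # [] - now initialized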

What you must do is call initialize_all_variables() after you have added the optimizer:

import tensorflow as tf

# ...build your model, producing a cross_entropy loss...

# Add the optimizer
train_op = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
# Add the op to initialize variables. This will include
# the optimizer slots added by AdamOptimizer().
init_op = tf.initialize_all_variables()

# Launch the graph in a session
sess = tf.Session()
# Actually initialize the variables
sess.run(init_op)
# Now train your model
for ...:
  sess.run(train_op)
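
(Note that in later TensorFlow releases tf.initialize_all_variables() was deprecated in favor of the identically behaved tf.global_variables_initializer().) If parts of your model are already initialized, say, restored from a checkpoint, and you only want to initialize the variables the optimizer just added, you can initialize an explicit list instead. A sketch, assuming a version with tf.global_variables() and tf.variables_initializer() (older releases spelled these tf.all_variables() and tf.initialize_variables()):

import tensorflow as tf

# Hypothetical model variable, initialized (or restored) before the optimizer exists.
w = tf.Variable(tf.zeros([2, 2]), name="w")
loss = tf.reduce_sum(tf.square(w))

sess = tf.Session()
sess.run(tf.variables_initializer([w]))  # model variables are ready

snapshot = set(tf.global_variables())    # remember what exists so far
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)

# Initialize only what the optimizer just created: the slots plus
# its beta1_power/beta2_power bookkeeping variables.
new_vars = [v for v in tf.global_variables() if v not in snapshot]
sess.run(tf.variables_initializer(new_vars))
sess.run(train_op)                       # no FailedPreconditionError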
