Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

0 votes
1.7k views
in Technique by (71.8m points)

tensorflow - `tape` is required when a `Tensor` loss is passed

A question about TensorFlow:

import numpy as np
import tensorflow as tf
from tensorflow import keras

x_train = [1,2,3]
y_train = [1,2,3]

W = tf.Variable(tf.random.normal([1]), name = 'weight')
b = tf.Variable(tf.random.normal([1]), name = 'bias')
hypothesis = W*x_train+b

optimizer = tf.optimizers.SGD (learning_rate=0.01)

train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

When I run the last line of my code, the following error comes out.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-52-cd6e22f66d09> in <module>()
----> 1 train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in _compute_gradients(self, loss, var_list, grad_loss, tape)
    530     # TODO(josh11b): Test that we handle weight decay in a reasonable way.
    531     if not callable(loss) and tape is None:
--> 532       raise ValueError("`tape` is required when a `Tensor` loss is passed.")
    533     tape = tape if tape is not None else backprop.GradientTape()
    534 

ValueError: `tape` is required when a `Tensor` loss is passed.

I know this is related to TensorFlow version 2, but I don't want to downgrade to version 1.

I want a solution that works with TensorFlow 2. Thanks.

question from:https://stackoverflow.com/questions/65913108/tape-is-required-when-a-tensor-loss-is-passed


1 Reply

0 votes
by (71.8m points)

Since you did not provide the cost function, I added one. Here is the code:

import tensorflow as tf

x_train = [1, 2, 3]
y_train = [1, 2, 3]

W = tf.Variable(tf.random.normal([1]), name='weight')
b = tf.Variable(tf.random.normal([1]), name='bias')

# Pass the loss as a callable (not a Tensor) so the optimizer can
# trace the computation and compute the gradients itself.
@tf.function
def cost():
    y_model = W * x_train + b
    error = tf.reduce_mean(tf.square(y_train - y_model))
    return error

# In TF2, `minimize` accepts a callable loss and performs one update step.
train = tf.keras.optimizers.Adam().minimize(cost, var_list=[W, b])

tf.print(W)
tf.print(b)
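Alternatively, you can keep the loss as a `Tensor` and supply the gradients yourself with `tf.GradientTape`, which is what the error message is hinting at. A minimal sketch of a hand-written training loop under that approach (step count and learning rate are my own choices, not from the question):

```python
import tensorflow as tf

x_train = tf.constant([1.0, 2.0, 3.0])
y_train = tf.constant([1.0, 2.0, 3.0])

W = tf.Variable(tf.random.normal([1]), name='weight')
b = tf.Variable(tf.random.normal([1]), name='bias')

optimizer = tf.optimizers.SGD(learning_rate=0.01)

for _ in range(5000):
    # Record the forward pass on a tape so gradients can be computed.
    with tf.GradientTape() as tape:
        y_model = W * x_train + b
        loss = tf.reduce_mean(tf.square(y_train - y_model))
    # Compute d(loss)/d(W), d(loss)/d(b) and apply one SGD step.
    grads = tape.gradient(loss, [W, b])
    optimizer.apply_gradients(zip(grads, [W, b]))

tf.print(W)  # should approach [1]
tf.print(b)  # should approach [0]
```

Since the data lie exactly on the line y = x, the loop should drive W toward 1 and b toward 0.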
