python - How to register a custom gradient for an operation composed of tf operations

More specifically, I have a simple forward pass (fprop) that is a composition of tf operations, and I want to override the TensorFlow gradient computation for it with my own gradient method using RegisterGradient.

What's wrong with this code?

import tensorflow as tf
from tensorflow.python.framework import ops

@ops.RegisterGradient("MyopGrad")
def frop_grad(op, grad):
    x = op.inputs[0]
    return 0 * x  # zero out to see the difference:

def fprop(x):
    x = tf.sqrt(x)
    out = tf.maximum(x, .2)
    return out

a = tf.Variable(tf.constant([5., 4., 3., 2., 1.], dtype=tf.float32))
h = fprop(a)
h = tf.identity(h, name="Myop")
grad = tf.gradients(h, a)

g = tf.get_default_graph()
with g.gradient_override_map({'Myop': 'MyopGrad'}):
    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())
        result = sess.run(grad)

print(result[0])

I expect the print to show all zeros, but instead I get:

[ 0.2236068   0.25000003  0.28867513  0.35355341  0.5       ]

1 Reply


You need to create the op inside the with g.gradient_override_map(...) block, so the override is recorded when the op is built.

Also, gradient_override_map keys on the op's type string, not the node name, so you need to map 'Identity' (the type of the op created by tf.identity) rather than the name 'Myop' to your new gradient.

Here is the full code:

import tensorflow as tf
from tensorflow.python.framework import ops

@ops.RegisterGradient("MyopGrad")
def frop_grad(op, grad):
    x = op.inputs[0]
    return 0 * x  # zero out to see the difference:

def fprop(x):
    x = tf.sqrt(x)
    out = tf.maximum(x, .2)
    return out

a = tf.Variable(tf.constant([5., 4., 3., 2., 1.], dtype=tf.float32))
h = fprop(a)

g = tf.get_default_graph()
with g.gradient_override_map({'Identity': 'MyopGrad'}):
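    # Ops created inside this scope look up their gradient through the map,
    # so the Identity op below (type 'Identity') uses MyopGrad.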
    h = tf.identity(h, name="Myop")
    grad = tf.gradients(h, a)

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    result = sess.run(grad)

print(result[0])

Output:

[ 0.  0.  0.  0.  0.]
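
For reference, newer TensorFlow releases (1.7+) also provide tf.custom_gradient, which attaches the custom gradient directly to the Python function and avoids RegisterGradient / gradient_override_map altogether. A minimal sketch of the same zero-gradient example, assuming a TF 1.x graph-mode setup (fprop_custom is just an illustrative name):

import tensorflow as tf

@tf.custom_gradient
def fprop_custom(x):
    out = tf.maximum(tf.sqrt(x), 0.2)

    def grad_fn(dy):
        # Zero out the gradient, mirroring MyopGrad above.
        return tf.zeros_like(x)

    return out, grad_fn

a = tf.constant([5., 4., 3., 2., 1.], dtype=tf.float32)
grad = tf.gradients(fprop_custom(a), a)

with tf.Session() as sess:
    print(sess.run(grad)[0])  # [0. 0. 0. 0. 0.]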
