I am fairly new to TensorFlow and am currently looking into custom op development. I have already read the official tutorial, but I feel that a lot happens behind the scenes, and I do not always want to put my custom ops in the user_ops directory.
As such, I took up the word2vec example,
which uses a custom "Skipgram" op. Its registration is defined here:
/word2vec_ops.cc
and its kernel implementation is here:
/word2vec_kernels.cc
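For context, my (simplified) mental model of the split between those two files is roughly the sketch below; the real Skipgram op has more outputs and attrs, so the names and signatures here are placeholders rather than the actual code:

// Simplified sketch of my mental model, not the real word2vec code
// (the actual Skipgram op has more outputs and attrs than shown here).
#include "tensorflow/core/framework/op.h"
#include "tensorflow/core/framework/op_kernel.h"

using namespace tensorflow;

// word2vec_ops.cc side: interface only -- registers the op's signature.
REGISTER_OP("Skipgram")
    .Attr("filename: string")    // placeholder attr
    .Output("examples: int32");  // placeholder output

// word2vec_kernels.cc side: implementation -- a kernel attached to the op by name.
class SkipgramOp : public OpKernel {
 public:
  explicit SkipgramOp(OpKernelConstruction* ctx) : OpKernel(ctx) {}

  void Compute(OpKernelContext* ctx) override {
    // Placeholder body: allocate the single output and fill it in.
    Tensor* out = nullptr;
    OP_REQUIRES_OK(ctx, ctx->allocate_output(0, TensorShape({}), &out));
    out->scalar<int32>()() = 0;
  }
};

REGISTER_KERNEL_BUILDER(Name("Skipgram").Device(DEVICE_CPU), SkipgramOp);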
Looking at the BUILD file, I tried to build the individual targets:
1) bazel build -c opt tensorflow/models/embedding:word2vec_ops
This generates a bunch of object files, as expected.
2) bazel build -c opt tensorflow/models/embedding:word2vec_kernels
Same for this.
3) bazel build -c opt tensorflow/models/embedding:gen_word2vec
This last build uses a custom rule, namely tf_op_gen_wrapper_py:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/tensorflow.bzl#L197-L231
It is interesting to note that this rule depends only on the op registration and not on the kernel itself.
After all of the above, if I build the py_binary itself using
bazel build -c opt tensorflow/models/embedding:word2vec
it works fine, but I fail to see where and how the kernel C++ code gets linked in.
Additionally, I would also like to understand the tf_op_gen_wrapper_py
rule and the whole compilation/linking procedure that goes on behind the scenes for op registration.
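My rough guess, for what it is worth, is that the REGISTER_* macros expand to file-scope static objects whose constructors add an entry to a process-wide registry, so that merely linking word2vec_kernels.cc into the final binary makes the kernel available at runtime, with no explicit reference from the Python side. Is that the right picture? A toy analogy of what I mean (my own sketch, not TensorFlow's actual internals):

// Toy analogy of how I imagine op/kernel registration works -- my own
// simplified model, NOT TensorFlow's real macros or registry.
#include <functional>
#include <iostream>
#include <map>
#include <string>

// A process-wide registry, populated before main() runs.
std::map<std::string, std::function<void()>>& KernelRegistry() {
  static std::map<std::string, std::function<void()>> registry;
  return registry;
}

// What I imagine a REGISTER_KERNEL_BUILDER-style macro boils down to:
// a static object whose constructor records a factory under the op name.
struct KernelRegistrar {
  KernelRegistrar(const std::string& op_name, std::function<void()> factory) {
    KernelRegistry()[op_name] = std::move(factory);
  }
};

// Linking this translation unit into the binary would then be all it takes
// for the "Skipgram" kernel to become visible to the runtime.
static KernelRegistrar skipgram_registrar("Skipgram", [] {
  std::cout << "running Skipgram kernel\n";
});

int main() {
  KernelRegistry()["Skipgram"]();  // the runtime looks kernels up by name
  return 0;
}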
Thanks.