
python - Exporting a Keras model as a TF Estimator: trained model cannot be found

I encountered the following issue when trying to export a Keras model as a TensorFlow Estimator for the purpose of serving the model. Since the same problem also popped up in an answer to this question, I will illustrate what happens on a toy example and provide my workaround for documentation purposes. This behaviour occurs with TensorFlow 1.12.0 and Keras 2.2.4, both with standalone Keras and with tf.keras.

The problem occurs when trying to export an Estimator that was created from a Keras model with tf.keras.estimator.model_to_estimator. Upon calling estimator.export_savedmodel, either a NotFoundError or a ValueError is thrown.

The code below reproduces the issue for a toy example.

Create a Keras model and save it:

import keras
model = keras.Sequential()
model.add(keras.layers.Dense(units=1,
                                activation='sigmoid',
                                input_shape=(10, )))
model.compile(loss='binary_crossentropy', optimizer='sgd')
model.save('./model.h5')

Next, convert the model to an estimator with tf.keras.estimator.model_to_estimator, add an input receiver function and export it in the SavedModel format with estimator.export_savedmodel:

# Convert the Keras model to a TF Estimator
import tensorflow as tf

tf_files_path = './tf'
estimator = tf.keras.estimator.model_to_estimator(keras_model=model,
                                                  model_dir=tf_files_path)

def serving_input_receiver_fn():
    return tf.estimator.export.build_raw_serving_input_receiver_fn(
        {model.input_names[0]: tf.placeholder(tf.float32, shape=[None, 10])})

# Export the estimator
export_path = './export'
estimator.export_savedmodel(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn())

This will throw:

ValueError: Couldn't find trained model at ./tf.


1 Reply


My workaround is as follows. Inspecting the ./tf folder makes clear that the call to model_to_estimator stored the necessary files in a keras subfolder, while export_savedmodel expects those files to be in the ./tf folder directly, since this is the path we specified for the model_dir argument:

$ tree ./tf
./tf
└── keras
    ├── checkpoint
    ├── keras_model.ckpt.data-00000-of-00001
    ├── keras_model.ckpt.index
    └── keras_model.ckpt.meta

1 directory, 4 files
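As far as I can tell, export_savedmodel looks for a checkpoint file directly in model_dir and never descends into subfolders, which is why the keras subfolder goes unnoticed. The mismatch can be illustrated without TensorFlow by recreating the directory layout; find_checkpoint_dirs is just a hypothetical helper for this sketch:

```python
import os
import tempfile

def find_checkpoint_dirs(root):
    """Return all directories under root that contain a 'checkpoint' file."""
    return [dirpath for dirpath, _, files in os.walk(root)
            if 'checkpoint' in files]

# Recreate the layout produced by model_to_estimator:
# <model_dir>/keras/{checkpoint, keras_model.ckpt.*}
with tempfile.TemporaryDirectory() as model_dir:
    keras_dir = os.path.join(model_dir, 'keras')
    os.makedirs(keras_dir)
    for name in ('checkpoint', 'keras_model.ckpt.index'):
        open(os.path.join(keras_dir, name), 'w').close()

    # The checkpoint lives in the 'keras' subfolder, not in model_dir itself,
    # which matches the "Couldn't find trained model at ./tf" error.
    hits = find_checkpoint_dirs(model_dir)
    print(hits == [keras_dir])                                     # True
    print(os.path.exists(os.path.join(model_dir, 'checkpoint')))   # False
```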

The simple workaround is to move these files up one folder. This can be done with Python:

import os
import shutil
from pathlib import Path

def up_one_dir(path):
    """Move all files in path up one folder, and delete the empty folder
    """
    parent_dir = str(Path(path).parents[0])
    for f in os.listdir(path):
        shutil.move(os.path.join(path, f), parent_dir)
    shutil.rmtree(path)

up_one_dir('./tf/keras')
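As a quick sanity check, the helper behaves as expected on a throwaway directory (the helper is repeated here so the snippet is self-contained; the file names are just placeholders):

```python
import os
import shutil
import tempfile
from pathlib import Path

def up_one_dir(path):
    """Move all files in path up one folder, and delete the empty folder."""
    parent_dir = str(Path(path).parents[0])
    for f in os.listdir(path):
        shutil.move(os.path.join(path, f), parent_dir)
    shutil.rmtree(path)

# Build a fake model_dir with a 'keras' subfolder holding dummy files
tf_dir = tempfile.mkdtemp()
keras_dir = os.path.join(tf_dir, 'keras')
os.makedirs(keras_dir)
for name in ('checkpoint', 'keras_model.ckpt.index'):
    open(os.path.join(keras_dir, name), 'w').close()

up_one_dir(keras_dir)

# The files now sit directly in the parent, and the subfolder is gone
contents = sorted(os.listdir(tf_dir))
print(contents)  # ['checkpoint', 'keras_model.ckpt.index']
shutil.rmtree(tf_dir)
```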

This makes the model_dir directory look like this:

$ tree ./tf
./tf
├── checkpoint
├── keras_model.ckpt.data-00000-of-00001
├── keras_model.ckpt.index
└── keras_model.ckpt.meta

0 directories, 4 files

Doing this manipulation between the model_to_estimator and export_savedmodel calls allows the model to be exported as desired:

export_path = './export'
estimator.export_savedmodel(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn())

INFO:tensorflow:SavedModel written to: ./export/temp-b'1549796240'/saved_model.pb

