
Cannot convert Tensorflow .pb frozen graph to tensorflow lite due to strange 'utf-8' codec error on Colab

I have an ONNX model that I converted to TensorFlow. That conversion went ahead without any problems, but now I want to convert the resulting .pb file to TF Lite using the following code:

import tensorflow as tf

TF_PATH = "/content/tf_model/saved_model.pb"  # where the frozen graph is stored
TFLITE_PATH = "./model.tflite"

# make a converter object from the saved tensorflow file
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    TF_PATH,                     # TensorFlow frozen-graph .pb model file
    input_arrays=['input_ids'],  # names of the input arrays as defined in the torch.onnx.export call earlier
    output_arrays=['logits'],    # names of the output arrays as defined in the torch.onnx.export call earlier
)

converter.experimental_new_converter = True

converter.target_spec.supported_ops = [tf.compat.v1.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.compat.v1.lite.OpsSet.SELECT_TF_OPS]

tf_lite_model = converter.convert()
# Save the model.
with open(TFLITE_PATH, 'wb') as f:
    f.write(tf_lite_model)

But when I run this cell on Colab I get the error:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa3 in position 3: invalid start byte

The traceback points to the line converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(...).

I can't seem to figure out what is causing this.
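For reference, the earlier ONNX-to-TensorFlow step looked roughly like this (a sketch, assuming the conversion used the onnx-tf package; the input path is hypothetical):

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model and wrap it in a TensorFlow backend representation.
onnx_model = onnx.load("/content/model.onnx")  # hypothetical path
tf_rep = prepare(onnx_model)

# export_graph() writes out the TensorFlow model; in recent onnx-tf versions
# this produces a SavedModel directory containing a saved_model.pb file.
tf_rep.export_graph("/content/tf_model")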



1 Reply

It looks like your .pb file is not a frozen graph but a SavedModel. from_frozen_graph expects a serialized GraphDef, and when it cannot parse the file as one it falls back to reading it as text, which is most likely where the 'utf-8' codec error comes from. If that guess is right, all you need is to change the conversion method: you are looking for from_saved_model.
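You can check this quickly: a SavedModel .pb parses as a SavedModel proto, while a frozen graph would be a plain GraphDef (a sketch using TensorFlow's bundled protobuf definitions and the path from the question):

from tensorflow.core.protobuf import saved_model_pb2

# Read the raw bytes of the .pb file.
with open("/content/tf_model/saved_model.pb", "rb") as f:
    data = f.read()

# If this parses cleanly, the file is a SavedModel, not a frozen GraphDef.
sm = saved_model_pb2.SavedModel()
sm.ParseFromString(data)
print("SavedModel with", len(sm.meta_graphs), "meta graph(s)")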

Assuming you are using TF2, it will be:

import tensorflow as tf

TF_PATH = "/content/tf_model/"  # where the SavedModel is stored - note: the directory, not the .pb file
TFLITE_PATH = "./model.tflite"

# make a converter object from the SavedModel directory
converter = tf.lite.TFLiteConverter.from_saved_model(TF_PATH)

converter.experimental_new_converter = True

converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]

tf_lite_model = converter.convert()
# Save the model.
with open(TFLITE_PATH, "wb") as f:
    f.write(tf_lite_model)
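As a sanity check (a sketch; it assumes the recorded input shape is fully defined), you can load the resulting .tflite file and run one dummy inference. Since the model uses SELECT_TF_OPS, use the tf.lite.Interpreter from the full tensorflow package, which should bundle the Flex delegate:

import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="./model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy tensor matching the input shape and dtype recorded by the converter.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

print("output shape:", interpreter.get_tensor(output_details[0]["index"]).shape)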

