The default value for tags is 'serve', and the default value for signature_keys is 'serving_default'. You can override them via the tags and signature_keys parameters in the Python API.
See
https://www.tensorflow.org/lite/api_docs/python/tf/lite/TFLiteConverter#from_saved_model
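As a minimal, self-contained sketch of overriding those defaults (the tiny `Adder` model and the `/tmp` paths are just stand-ins so the snippet runs on its own; substitute your real model directory and its actual tags/keys):

```python
import tensorflow as tf

# A tiny stand-in model so the example is self-contained.
class Adder(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return x + 1.0

tf.saved_model.save(Adder(), '/tmp/demo_saved_model')

# Spell out the defaults explicitly; replace with your model's
# actual tags and signature keys when they differ.
converter = tf.lite.TFLiteConverter.from_saved_model(
    '/tmp/demo_saved_model',
    tags=['serve'],
    signature_keys=['serving_default'])
tflite_model = converter.convert()  # bytes of the .tflite flatbuffer
```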
EDIT:
Adding details on the failure after passing the correct tags and signature keys.
EDIT2: Updated sample code
This looks like an old model, saved with an older TensorFlow version.
First, let's fix the saved-model version issue: you need to re-save it.
import tensorflow as tf

MODEL_DIR = 'model_path'
SIGNATURE_KEYS = ['default']
SIGNATURE_TAGS = set()

# Load with the old (empty) tag set, then re-save in the current
# format, preserving the existing signatures.
saved_model = tf.saved_model.load(MODEL_DIR, tags=SIGNATURE_TAGS)
tf.saved_model.save(saved_model, 'new_model_path', signatures=saved_model.signatures)
# You can now convert like this:
converter = tf.lite.TFLiteConverter.from_saved_model(
    'new_model_path', signature_keys=SIGNATURE_KEYS, tags=['serve'])
Now if you try converting, you won't see this problem, but you will see a new issue :)
The error log summary makes two points:
Some ops are not supported by the native TFLite runtime, you can enable TF kernels fallback using TF Select. See instructions: https://www.tensorflow.org/lite/guide/ops_select
Flex ops: TensorArrayGatherV3, TensorArrayReadV3, TensorArrayScatterV3, TensorArraySizeV3, TensorArrayV3, TensorArrayWriteV3
and
Some ops in the model are custom ops, See instructions to implement custom ops: https://www.tensorflow.org/lite/guide/ops_custom
Custom ops: HashTableV2, LookupTableFindV2, LookupTableImportV2
The new issue arises because this model uses ops that TFLite doesn't support natively at the moment, for example the TensorArray and HashTable ops.
Some of these ops (the TensorArray* ops) can be supported using TF Select mode, see here.
The other ops, HashTableV2, LookupTableFindV2 and LookupTableImportV2, are available in TFLite as custom ops.
See this answer on how to enable it.
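Putting both workarounds together, the converter can be told to fall back to TF kernels (Flex/TF Select) and to pass the remaining ops through as custom ops. A sketch under the same assumptions as before (the trivial `Adder` stands in for the real model, which would actually exercise these flags):

```python
import tensorflow as tf

# Stand-in model; with the real model these flags are what make the
# TensorArray* and HashTable* ops convertible.
class Adder(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return x + 1.0

tf.saved_model.save(Adder(), '/tmp/flex_demo_model')

converter = tf.lite.TFLiteConverter.from_saved_model('/tmp/flex_demo_model')
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # use native TFLite kernels where possible
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF kernels for ops like TensorArray*
]
converter.allow_custom_ops = True    # let HashTableV2 / LookupTable* pass through as custom ops
tflite_model = converter.convert()
```

Note that a model converted with SELECT_TF_OPS needs the Flex delegate linked into the runtime, and custom ops need their kernels registered with the interpreter before inference.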
Also, the TFLite team is working on adding support for hashtables as builtin ops, so soon you won't need these extra steps.