As a test of exporting/importing a model, I used the DNN classifier described here. To export it, I use the code below.

To check/understand the input this function sees, I want to use `tf.Print`. I have tried every suggestion for using `tf.Print` that I could find, but none of them work. The problem seems to be that, when I add the print to the graph, the function it lives in does not return a tensor.
```python
def serving_input_receiver_fn():
    serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
    feature_configs = tf.feature_column.make_parse_example_spec(my_feature_columns)
    features = tf.parse_example(serialized_tf_example, feature_configs)
    receiver_tensors = {'examples': serialized_tf_example}
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

exported = classifier.export_savedmodel(EXPORT_PATH, serving_input_receiver_fn)
```
If I insert this, it does not seem to get evaluated:
```python
serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
serialized_tf_example = tf.Print(serialized_tf_example, [serialized_tf_example], message="serialized tf example:")
```
Replacing the return value with the following leads to an error:
```python
a = tf.Print(tf.estimator.export.ServingInputReceiver(features, receiver_tensors), [serialized_tf_example], message="serialized tf example:")
return a
```
The error:
```
TypeError: Failed to convert object of type <class 'tensorflow.python.estimator.export.export.ServingInputReceiver'> to Tensor. Contents: ServingInputReceiver(features={'PetalLength': <tf.Tensor 'ParseExample/ParseExample:0' shape=(?, 1) dtype=float32>, 'PetalWidth': <tf.Tensor 'ParseExample/ParseExample:1' shape=(?, 1) dtype=float32>, 'SepalLength': <tf.Tensor 'ParseExample/ParseExample:2' shape=(?, 1) dtype=float32>, 'SepalWidth': <tf.Tensor 'ParseExample/ParseExample:3' shape=(?, 1) dtype=float32>}, receiver_tensors={'examples': <tf.Tensor 'tf_example:0' shape=<unknown> dtype=string>}, receiver_tensors_alternatives=None). Consider casting elements to a supported type.
```
I do not see how to add a `tf.Print` that actually gets evaluated without breaking the code. It also seems I cannot return both `serialized_tf_example` and `tf.estimator.export.ServingInputReceiver(features, receiver_tensors)` from the function, since that would interfere with passing `serving_input_receiver_fn` to `export_savedmodel()`.
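For reference, here is a sketch of the placement I am aiming for: since `tf.Print` returns a new tensor rather than printing in place, feeding that returned tensor into `tf.parse_example` should put the print op on the data path of the serving graph. This is an untested sketch, not a confirmed fix; `my_feature_columns` here is a hypothetical single-column stand-in for the real feature columns, and I use the `tf.compat.v1` API so the snippet also builds under TF 2:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Hypothetical stand-in for the real my_feature_columns.
my_feature_columns = [tf.feature_column.numeric_column('PetalLength')]

def serving_input_receiver_fn():
    serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
    # tf.Print returns a new tensor; feeding *that* tensor into
    # tf.parse_example places the print op on the data path, so it should
    # fire whenever the serving graph is actually evaluated.
    printed = tf.Print(serialized_tf_example, [serialized_tf_example],
                       message="serialized tf example:")
    feature_configs = tf.feature_column.make_parse_example_spec(my_feature_columns)
    features = tf.parse_example(printed, feature_configs)
    # The receiver still exposes the original placeholder.
    receiver_tensors = {'examples': serialized_tf_example}
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
```

The key difference from my attempts above is that nothing is wrapped around the `ServingInputReceiver` itself; the print is attached to a tensor that downstream ops consume.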