If the input to the graph is fed through a placeholder:
input_layer = tf.placeholder(tf.float32, [...], name="inputs")
then the frozen graph containing this input_layer will have an input node named "inputs". How do I find out the name of the input node of a frozen Estimator graph? Is it the first layer in the model function? Is it the name of the dictionary key of the model function's features argument?
When I print the nodes of the graph def generated after freezing, I get these candidate input layer names:
# Generated by the numpy_input_fn
enqueue_input/random_shuffle_queue
random_shuffle_queue_DequeueMany/n
random_shuffle_queue_DequeueMany
# This is probably the input
inputs/shape
inputs
# More nodes here
...
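For reference, a minimal sketch of how such a node listing can be printed from a frozen GraphDef (the file name frozen_graph.pb is an assumed placeholder, not from the original code):
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile('frozen_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# Print every node's name and op to look for candidate input nodes.
for node in graph_def.node:
    print(node.name, node.op)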
Update

More updates
I went through the guide on using SavedModel with Estimators and came up with this code:
input_graph_def = graph.as_graph_def(add_shapes=True)
input_layer = graph.get_operation_by_name('input_layer').outputs[0]
input_shape = input_layer.get_shape().as_list()[1:]
run_params['input_shape'] = input_shape
feature_spec = {'x': tf.FixedLenFeature(input_shape, input_layer.dtype)}

estimator = tf.estimator.Estimator(model_fn=_predict_model_fn,
                                   params=run_params,
                                   model_dir=checkpoint_dir)

def _serving_input_receiver_fn():
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()

exported_model_path = estimator.export_savedmodel(checkpoint_dir, _serving_input_receiver_fn)
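As an aside, if the exported model is meant to be fed raw tensors rather than serialized tf.Example protos, a raw serving input receiver is a common alternative to the parsing one above. A minimal sketch, assuming a single float feature 'x' with the same input_shape as above (the placeholder name 'x' is illustrative):
def _raw_serving_input_receiver_fn():
    # The placeholder created here becomes the named input node of the exported graph.
    inputs = {'x': tf.placeholder(tf.float32, [None] + input_shape, name='x')}
    return tf.estimator.export.build_raw_serving_input_receiver_fn(inputs)()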
However, when I run it, I get this error:
File "... my module", line ..., in ...
exported_model_path = estimator.export_savedmodel(checkpoint_dir, _serving_inp
File "...\tensorflow\python\estimator\estimator.py", line 598, in export_savedmodel
serving_input_receiver.receiver_tensors_alternatives)
File "...\tensorflow\python\estimator\export\export.py", line 199, in build_all_signature_defs
'{}'.format(type(export_outputs)))
ValueError: export_outputs must be a dict and not <class 'NoneType'>
Here is _predict_model_fn:
def _predict_model_fn(features, mode, params):
    features = features['x']

    # features are passed through layers
    features = _network_fn(features, mode, params)

    # the output layer
    outputs = _get_output(features, params["output_layer"], params["num_classes"])

    predictions = {
        "outputs": outputs
    }

    return _create_model_fn(mode, predictions=predictions)
def _create_model_fn(mode, predictions, loss=None, train_op=None,
                     eval_metric_ops=None, training_hooks=None):
    return tf.estimator.EstimatorSpec(mode=mode,
                                      predictions=predictions,
                                      loss=loss,
                                      train_op=train_op,
                                      eval_metric_ops=eval_metric_ops,
                                      training_hooks=training_hooks)
Apparently, the export_outputs argument of the EstimatorSpec has to be supplied whenever one decides to export the model. With that, _predict_model_fn gets this return statement, and the export_outputs parameter is added to _create_model_fn:
return _create_model_fn(mode, predictions=predictions,
                        export_outputs={
                            "outputs": tf.estimator.export.PredictOutput(outputs)
                        })
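Once the export succeeds, the serving signature of the exported SavedModel records exactly which tensors are the inputs and outputs. A minimal sketch of inspecting it (the 'serving_default' key is the default signature name; the exact feature keys may differ):
export_dir = exported_model_path.decode('utf-8')  # export_savedmodel may return a bytes path
with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    signature = meta_graph.signature_def['serving_default']
    print(signature.inputs)   # feature keys mapped to the actual input tensor names
    print(signature.outputs)  # e.g. "outputs" mapped to the actual output tensor name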
Answer (score: 1):
There is no way to tell from the graph alone which tensor is the input and which is the output.
You should use the SavedModel functionality. Part of it is generating a signature for the model, which states exactly which tensors are the inputs and which are the outputs.
You can take the same model and export it with different signatures. For example, one signature might take a protocol buffer and give you back a probability, while another might take a string and give you back an embedding in some space.
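For illustration, a minimal sketch of how one model function could expose two such signatures through export_outputs (the embedding tensor and the signature/key names are assumed placeholders, not from the code above):
export_outputs = {
    # Default serving signature, returning the class scores.
    tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
        tf.estimator.export.PredictOutput({'outputs': outputs}),
    # A second, named signature returning a hypothetical embedding tensor.
    'embedding': tf.estimator.export.PredictOutput({'embedding': embedding}),
}
return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions,
                                  export_outputs=export_outputs)
The signatures of an exported model can then be listed with saved_model_cli show --dir <export_dir> --all, which prints the input and output tensor names of each signature.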