I trained a MobileNet TensorFlow model with the Object Detection API on Google Colab. After training finished, I exported the saved model and ran it in Python. The model works correctly and detects objects as expected.
To convert the checkpoint to a TensorFlow saved model:
python /content/models/research/object_detection/exporter_main_v2.py \
--trained_checkpoint_dir {model_dir} \
--output_directory {output_directory} \
--pipeline_config_path {pipeline_file}
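The {model_dir}, {output_directory} and {pipeline_file} placeholders stand for ordinary Python variables (in a Colab cell, IPython interpolates {name} into ! shell commands). For illustration only, they might be set along these lines; the paths are examples, not my real ones:

# Example placeholder values for the export command above (hypothetical paths).
model_dir = '/content/training'               # directory containing the ckpt-* checkpoint files
pipeline_file = '/content/pipeline.config'    # pipeline config used during training
output_directory = '/content/exported_model'  # the exporter writes a saved_model/ folder here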
I load the model and run inference in Python as follows:
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("path....")

def run_inference_image(model, image):
    # Convert the input image to a batched tensor of shape [1, H, W, 3]
    image = np.asarray(image)
    input_tensor = tf.convert_to_tensor(image)
    input_tensor = input_tensor[tf.newaxis, ...]

    # Run the detection model
    output_dict = model(input_tensor)

    # Keep only the valid detections and convert the tensors to numpy arrays
    num_detections = int(output_dict.pop('num_detections'))
    output_dict = {key: value[0, :num_detections].numpy()
                   for key, value in output_dict.items()}
    output_dict['num_detections'] = num_detections

    # Detection classes should be integers
    output_dict['detection_classes'] = output_dict['detection_classes'].astype(np.int64)

    return output_dict
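A minimal usage sketch (the image file name is just an example):

from PIL import Image

# Run the helper on a local test image; an OD API saved model returns
# normalized boxes, per-detection scores and integer class ids.
test_image = Image.open('test.jpg').convert('RGB')
detections = run_inference_image(model, test_image)
print(detections['num_detections'])
print(detections['detection_boxes'])    # [ymin, xmin, ymax, xmax], normalized
print(detections['detection_scores'])
print(detections['detection_classes'])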
To convert the model to TensorFlow.js, I run the following command:
tensorflowjs_converter --control_flow_v2=True --input_format=tf_saved_model --metadata= --saved_model_tags=serve --signature_name=serving_default --strip_debug_ops=True --weight_shard_size_bytes=4194304 ./saved_model ./web_model
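For reference, the serving signature of ./saved_model (the directory passed to the converter above) can be inspected on the Python side; a small diagnostic sketch, assuming the default serving_default signature produced by exporter_main_v2.py:

import tensorflow as tf

# Diagnostic only: print the input/output signature of the exported model.
loaded = tf.saved_model.load('./saved_model')
serving_fn = loaded.signatures['serving_default']
print(serving_fn.structured_input_signature)  # expected input dtype and shape
print(serving_fn.structured_outputs)          # detection output tensors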
The conversion itself works as expected and throws no errors.
The problem occurs when I try to use the converted model with TensorFlow.js:
const model = await loadGraphModel("http://127.0.0.1:8080/model.json");
// random image so I can test
model.executeAsync(tf.zeros([1,320,320,3]).toInt()).then(predictions => {
console.log(predictions)
tf.engine().endScope();
});
At this point the following exception is thrown:
util_base.js:107 Uncaught (in promise) Error: TensorList shape mismatch: Shapes -1 and 3 must match
at Module.assert (util_base.js:107)
at assertShapesMatchAllowUndefinedSize (tensor_utils.js:26)
at TensorList.setItem (tensor_list.js:239)
at _callee2$ (control_executor.js:309)
at tryCatch (runtime.js:63)
at Generator.invoke [as _invoke] (runtime.js:282)
at Generator.prototype.<computed> [as next] (runtime.js:116)
at asyncGeneratorStep (asyncToGenerator.js:3)
at _next (asyncToGenerator.js:25)
at asyncToGenerator.js:32
at new Promise (<anonymous>)
at Module.<anonymous> (asyncToGenerator.js:21)
at Module.executeOp (control_executor.js:411)
at operation_executor.js:59
at executeOp (operation_executor.js:139)
at _loop (graph_executor.js:543)
at GraphExecutor.processStack (graph_executor.js:579)
at GraphExecutor._callee4$ (graph_executor.js:469)
at tryCatch (runtime.js:63)
at Generator.invoke [as _invoke] (runtime.js:282)
at Generator.prototype.<computed> [as next] (runtime.js:116)
at asyncGeneratorStep (asyncToGenerator.js:3)
at _next (asyncToGenerator.js:25)
at asyncToGenerator.js:32
at new Promise (<anonymous>)
at GraphExecutor.<anonymous> (asyncToGenerator.js:21)
at GraphExecutor.executeWithControlFlow (graph_executor.js:513)
at GraphExecutor._callee2$ (graph_executor.js:326)
at tryCatch (runtime.js:63)
at Generator.invoke [as _invoke] (runtime.js:282)
at Generator.prototype.<computed> [as next] (runtime.js:116)
at asyncGeneratorStep (asyncToGenerator.js:3)
at _next (asyncToGenerator.js:25)
at asyncToGenerator.js:32
at new Promise (<anonymous>)
at GraphExecutor.<anonymous> (asyncToGenerator.js:21)
at GraphExecutor._executeAsync (graph_executor.js:365)
at GraphExecutor._callee3$ (graph_executor.js:385)
at tryCatch (runtime.js:63)
at Generator.invoke [as _invoke] (runtime.js:282)
at Generator.prototype.<computed> [as next] (runtime.js:116)
at asyncGeneratorStep (asyncToGenerator.js:3)
at _next (asyncToGenerator.js:25)
at asyncToGenerator.js:32
at new Promise (<anonymous>)
at GraphExecutor.<anonymous> (asyncToGenerator.js:21)
at GraphExecutor.executeFunctionAsync (graph_executor.js:396)
at _loop$ (control_executor.js:99)
at tryCatch (runtime.js:63)
at Generator.invoke [as _invoke] (runtime.js:282)
I have tried both tensorflowjs ^2.0.0 and tensorflowjs ^3.0.0, with the same error. I have trained two different models, SSD MobileNet V2 FPNLite 320x320 and SSD MobileNet v2 320x320, and the problem occurs with both.
I have also tried converting the model with both tensorflowjs 2 and 3, to no avail.