Problem connecting from Node.js to TensorFlow Serving over gRPC

Date: 2017-07-07 03:30:30

Tags: node.js machine-learning tensorflow grpc tensorflow-serving

I exported the TensorFlow model from the serving example and started the model server with the following command line:

root@tensorflow:/opt/serving# bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --enable_batching --port=9000 --model_name=mnist --model_base_path=/tmp/monitored
2017-07-07 15:20:53.248475: I tensorflow_serving/model_servers/main.cc:155] Building single TensorFlow model file config:  model_name: mnist model_base_path: /tmp/monitored model_version_policy: 0
2017-07-07 15:20:53.249112: I tensorflow_serving/model_servers/server_core.cc:375] Adding/updating models.
2017-07-07 15:20:53.249149: I tensorflow_serving/model_servers/server_core.cc:421]  (Re-)adding model: mnist
2017-07-07 15:20:53.350232: I tensorflow_serving/core/basic_manager.cc:698] Successfully reserved resources to load servable {name: mnist version: 2}
2017-07-07 15:20:53.350375: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: mnist version: 2}
2017-07-07 15:20:53.350431: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: mnist version: 2}
2017-07-07 15:20:53.350985: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /tmp/monitored/2
2017-07-07 15:20:53.351062: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:227] Loading SavedModel from: /tmp/monitored/2
2017-07-07 15:20:53.362362: W external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
2017-07-07 15:20:53.362431: W external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2017-07-07 15:20:53.362447: W external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-07-07 15:20:53.517579: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:146] Restoring SavedModel bundle.
2017-07-07 15:20:53.538208: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:181] Running LegacyInitOp on SavedModel bundle.
2017-07-07 15:20:53.555332: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:275] Loading SavedModel: success. Took 204257 microseconds.
2017-07-07 15:20:53.555493: I tensorflow_serving/servables/tensorflow/saved_model_bundle_factory.cc:92] Wrapping session to perform batch processing
2017-07-07 15:20:53.556006: I tensorflow_serving/servables/tensorflow/bundle_factory_util.cc:153] Wrapping session to perform batch processing
2017-07-07 15:20:53.556751: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: mnist version: 2}
2017-07-07 15:20:53.591026: I tensorflow_serving/model_servers/main.cc:298] Running ModelServer at 0.0.0.0:9000 ...

I tested it with the Python example client, which works:

bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000

Then, based on this example https://github.com/clarle/node-tensorflow-serving-demo, I created this code:

const grpc = require('grpc');
const image = require('./test/fixtures/image');

// TensorFlow Serving configuration settings
const config = require('./config');

// Load Protocol Buffers
const proto = grpc.load('./protos/mnist_inference.proto').tensorflow.serving;
const MnistService = proto.MnistService;

// Create TensorFlow Serving MNIST client
const client = new MnistService(config.TENSORFLOW_SERVING_HOST, grpc.credentials.createInsecure());

const image_data = image;
client.classify({ image_data }, (err, mnistResponse) => {
    if (err) {
        // TODO: Implement actual error handler
        console.log(err);
    } else {
        console.log(mnistResponse);
    }
});
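For context, the demo's proto declares `image_data` as a repeated float field, so I believe the fixture needs to be a flat array of 784 floats (one per pixel of a 28x28 MNIST image), scaled to [0, 1]. A minimal sketch of that preprocessing (the `toMnistInput` helper is hypothetical, not part of the demo):

```javascript
// Hypothetical helper: convert raw 0-255 grayscale pixels into the
// 784-element float array the MNIST serving example expects.
function toMnistInput(pixels) {
    if (pixels.length !== 28 * 28) {
        throw new Error('expected 784 pixel values, got ' + pixels.length);
    }
    return pixels.map(p => p / 255); // scale each pixel to [0, 1]
}

// Example with a dummy all-gray image:
const dummy = new Array(784).fill(128);
console.log(toMnistInput(dummy).length); // 784
```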

But it doesn't work and returns:

{ Error
    at C:\tmp\node-tensorflow-serving-demo\node_modules\grpc\src\node\src\client.js:569:15 code: 12, metadata: Metadata { _internal_repr: {} } }
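Looking up the numeric code from the error, 12 maps to gRPC's UNIMPLEMENTED status, which as I understand it means the server does not recognize the method or service that was called. A quick reference sketch of the relevant entries from the standard gRPC status-code table:

```javascript
// Standard gRPC status codes (from the gRPC spec); the error above
// carries code 12, i.e. UNIMPLEMENTED.
const GRPC_STATUS = {
    0: 'OK',
    4: 'DEADLINE_EXCEEDED',
    5: 'NOT_FOUND',
    12: 'UNIMPLEMENTED', // method/service not recognized by the server
    14: 'UNAVAILABLE'    // could not reach the server at all
};

console.log(GRPC_STATUS[12]); // UNIMPLEMENTED
```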

I have no idea where to look or how to fix this. The proto file is unchanged from the demo:

https://github.com/clarle/node-tensorflow-serving-demo/blob/master/protos/mnist_inference.proto

Any help is much appreciated.

0 Answers:

There are no answers yet.