grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.RESOURCE_EXHAUSTED, details="Received message larger than max")

Asked: 2018-01-15 22:20:05

Tags: grpc tensorflow-serving

I built a segmentation model in Keras and want to run it with TensorFlow Serving. I am able to export the model and start the model server, but when I run the client.py file the model's output is too large and the request fails. The error is:

  File "/home/.../serving/bazel-bin/tensorflow_serving/car_mask/mask_client.runfiles/tf_serving/tensorflow_serving/car_mask/mask_client.py", line 47, in <module>
    result = stub.Predict(request, 10.0)
  File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 310, in __call__
    self._request_serializer, self._response_deserializer)
  File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 196, in _blocking_unary_unary
    raise _abortion_error(rpc_error_call)
grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.RESOURCE_EXHAUSTED, details="Received message larger than max (4194349 vs. 4194304)")

How can I fix this? Is there a gRPC option I can add in client.py to increase the maximum message size? Thanks!

1 Answer:

Answer 0 (score: 3):

You can try this:

import grpc
import grpc.beta.implementations
from grpc._cython import cygrpc

def insecure_channel(host, port):
    # Build a low-level channel with unlimited (-1) send/receive message sizes,
    # then wrap it in a beta-API Channel so it can be passed to the
    # beta_create_*_stub() helpers used by the client.
    channel = grpc.insecure_channel(
        target=host if port is None else '%s:%d' % (host, port),
        options=[(cygrpc.ChannelArgKey.max_send_message_length, -1),
                 (cygrpc.ChannelArgKey.max_receive_message_length, -1)])
    return grpc.beta.implementations.Channel(channel)

From here
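
For context, here is a minimal sketch of how this wrapper might be wired into client.py, assuming the beta-API client pattern that the traceback suggests (beta_create_PredictionService_stub from tensorflow_serving.apis on TF 1.x). The host/port, model name, signature name, input key and dummy input are placeholders, not values taken from the question:

import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2

# Placeholder server address -- adjust to your deployment.
channel = insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'car_mask'            # placeholder model name
request.model_spec.signature_name = 'predict'   # placeholder signature
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(
        np.zeros((1, 256, 256, 3), dtype=np.float32)))  # dummy input tensor

# Because the channel was created with both message-size limits set to -1
# (unlimited), a response larger than the default 4 MB no longer raises
# RESOURCE_EXHAUSTED.
result = stub.Predict(request, 10.0)  # 10-second timeout, as in the question

On newer grpcio releases the same effect can be achieved without touching cygrpc internals, by passing the public string-keyed channel options ('grpc.max_send_message_length', -1) and ('grpc.max_receive_message_length', -1) directly to grpc.insecure_channel.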