ValueError when predicting with a saved machine learning model in a Flask REST API

Date: 2018-11-01 20:01:37

Tags: python tensorflow machine-learning flask keras

I trained a model and saved it with model.save('filename.h5'). It makes predictions just fine, but when I load the same model in my flask_app and try to predict with it, I get a ValueError:

Relevant output:

The data type of the acquired data from the post call: <class 'str'>

Preprocessing
Pre-processing complete
Data type of the input data: <class 'numpy.ndarray'>
Predicting for requirement:
[2018-11-01 15:34:56,261] ERROR in app: Exception on /predict [POST]
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask\app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask\app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask_cors\extension.py", line 161, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask\app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask\_compat.py", line 35, in reraise
    raise value
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask\app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\ProgramData\Anaconda3\lib\site-packages\flask\app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "C:\Users\singroha\PycharmProjects\smart_classification_ML_model\flask_app\predict_app.py", line 99, in predict
    prediction = requirement_model.predict(r_predict_text)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\training.py", line 1164, in predict
    self._make_predict_function()
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\training.py", line 554, in _make_predict_function
    **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 2744, in function
    return Function(inputs, outputs, updates=updates, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 2546, in __init__
    with tf.control_dependencies(self.outputs):
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 5002, in control_dependencies
    return get_default_graph().control_dependencies(control_inputs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 4541, in control_dependencies
    c = self.as_graph_element(c)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 3488, in as_graph_element
    return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 3567, in _as_graph_element_locked
    raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("dense_7/Softmax:0", shape=(?, 2), dtype=float32) is not an element of this graph.

How things flow:

  • I run a Flask server, which loads the saved model
  • When it receives a request from the client, the predict method is called
  • I preprocess the data to get it ready to feed into the model
  • Then I try to predict, and this happens
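The error at the last step points at how TensorFlow 1.x tracks its default graph: it lives in thread-local storage, so a model loaded in the main thread registers its tensors in that thread's graph, while Flask dispatches each request on a worker thread, where get_default_graph() can hand back a different graph that knows nothing about dense_7/Softmax:0. The toy Registry below mimics that behaviour in plain Python (the class and names are illustrative, not TensorFlow's real API):

```python
import threading

class Registry:
    """Toy stand-in for TF 1.x's thread-local default-graph stack."""
    _local = threading.local()

    @classmethod
    def default(cls):
        # Each thread that asks for the default gets its own instance,
        # just as tf.get_default_graph() can give a Flask worker thread
        # a graph that does not contain the loaded model's tensors.
        if not hasattr(cls._local, "graph"):
            cls._local.graph = object()
        return cls._local.graph

main_graph = Registry.default()      # "loaded the model here"
worker_graph = []

t = threading.Thread(target=lambda: worker_graph.append(Registry.default()))
t.start()
t.join()

# The worker thread saw a different "default graph" than the main thread.
print(main_graph is worker_graph[0])  # → False
```

The same mismatch is what makes the model's output tensor "not an element of this graph" inside the request handler.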

What I thought and what I did:

  • I thought I simply wasn't preprocessing the data correctly
  • So I pickled the preprocessed data, went back to the training script, ran a prediction there with that pickled object, and it worked.

I tried using _make_predict_function and a few other things I found on the internet, but nothing seemed to work for me. Any insight into this would be greatly appreciated. Thanks.
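For reference, the workaround most often suggested for this exact error combines _make_predict_function with pinning the graph. This is only a sketch, assuming TensorFlow 1.x with standalone Keras; filename.h5 is the saved model from the question, and predict_requirement is a hypothetical helper standing in for the body of the Flask view:

```python
import tensorflow as tf
from keras.models import load_model

# At startup (main thread): load the model, force Keras to build its
# predict function now rather than lazily on first use, and remember
# the graph the model's tensors live in.
requirement_model = load_model('filename.h5')
requirement_model._make_predict_function()
graph = tf.get_default_graph()

def predict_requirement(r_predict_text):
    # In the request handler (worker thread): re-enter the saved graph
    # so the model's tensors are found there, instead of the worker
    # thread's own, empty default graph.
    with graph.as_default():
        return requirement_model.predict(r_predict_text)
```

The key point is that both load_model and the later predict call must run against the same graph object.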

Here is the predict function from the flask_app:

@app.route("/predict", methods=["POST"])
# @cross_origin(origin='localhost',headers=['Content- Type','Authorization'])
def predict():
    post_message = request.get_json(force=True)

    # Logging
    print('Data message received from the client', post_message)
    print("\nThe data type of the acquired data from the post call:", type(post_message))

    requirement = []
    requirement.append(post_message)

    # Preprocessing data
    r_predict_text, (sent_text, s_predict_text) = preprocess_data(requirement)

    # print("\n\n printing preprocessed data:")
    # print(r_predict_text)

    # Saving the preprocessed data for verification
    # with open('text_req.pickle', 'wb') as pickle_file:
    #     pickle.dump(r_predict_text, pickle_file, protocol=pickle.HIGHEST_PROTOCOL)

    print('Data type of the input data:', type(r_predict_text))
    print('Predicting for requirement:')

    # Everything breaks at this point, after the request is received
    prediction = requirement_model.predict(r_predict_text)

    # Ignore this for now: I was checking whether the response gets sent
    # back to the client when the predict call is bypassed
    prediction = 'SMART'

    response = {
        'prediction': {
            'SMART': prediction
        }
    }
    print('sending response to the client')
    return jsonify(response)
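Separately, once predict succeeds, the hard-coded 'SMART' placeholder can be replaced by the real score, but note that jsonify typically cannot serialize numpy float32 values directly, so the model output needs converting to plain Python types first. A minimal sketch, where the (1, 2) array is made-up stand-in data for the softmax output of shape (?, 2):

```python
import numpy as np

# Stand-in for requirement_model.predict(r_predict_text): a float32
# softmax over two classes, one row per input sample.
prediction = np.array([[0.91, 0.09]], dtype=np.float32)

# Convert the numpy scalar to a plain float so it is JSON-serializable.
response = {
    'prediction': {
        'SMART': float(prediction[0][0])
    }
}
print(response)
```

The same conversion applies to any numpy integer or array destined for a JSON response (lists via tolist(), scalars via float()/int()).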

0 Answers:

There are no answers yet