How to load a local resource from a Python package loaded in AWS PySpark

Asked: 2020-05-05 17:31:48

Tags: python pyspark amazon-emr aws-glue

I have uploaded a Python package to AWS EMR for use with PySpark. The package has the following structure, with a resource file (a scikit-learn joblib model) inside it:

myetllib
    ├── Dockerfile
    ├── __init__.py
    ├── modules
    │   ├── bin
    │   ├── joblib
    │   ├── joblib-0.14.1.dist-info
    │   ├── numpy
    │   ├── numpy-1.18.4.dist-info
    │   ├── numpy.libs
    │   ├── scikit_learn-0.21.3.dist-info
    │   ├── scipy
    │   ├── scipy-1.4.1.dist-info
    │   └── sklearn
    ├── requirements.txt
    └── mysubmodule
        ├── __init__.py
        ├── model.py
        └── models/my_model.joblib

I then zip the package and upload it to EMR. In the console I can now import model.py, e.g.

from myetllib.mysubmodule.model import load_model, run_model

But when I call load_model, joblib raises an error complaining that it cannot find the package resource file, i.e. models/my_model.joblib. The path is set correctly:

import os
import warnings

import joblib

BASE_PATH = os.path.dirname(os.path.realpath(__file__))
MODEL_PATH = os.path.join(BASE_PATH, "models/my_model.joblib")

def load_model():
    '''
        load scikit-learn model via joblib
    '''
    with warnings.catch_warnings():
        warnings.filterwarnings('ignore', category=UserWarning)
        return joblib.load(MODEL_PATH)
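One likely cause (a sketch, not a confirmed diagnosis): when Spark ships the package to executors as a zip, `__file__`-relative paths point *inside* the archive and cannot be opened as regular files, which is exactly what `NotADirectoryError` signals. Since `joblib.load` also accepts a file-like object, the resource bytes can be read out of the archive instead. `read_packaged_resource` below is a hypothetical helper, not part of joblib:

```python
import io
import os
import zipfile


def read_packaged_resource(path):
    """Return a binary file-like object for `path`, which may point inside a .zip.

    Paths like '/tmp/.../mypkg.zip/mypkg/models/model.joblib' cannot be
    opened with plain open(), hence the NotADirectoryError.
    """
    marker = ".zip" + os.sep
    if marker in path:
        archive, member = path.split(marker, 1)
        archive += ".zip"
        # Zip members always use forward slashes, regardless of OS.
        member = member.replace(os.sep, "/")
        with zipfile.ZipFile(archive) as zf:
            return io.BytesIO(zf.read(member))
    # Regular filesystem path: open it directly.
    return open(path, "rb")
```

With this, load_model could return `joblib.load(read_packaged_resource(MODEL_PATH))`, which works both when the package is unzipped on disk and when it is imported from the archive.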

The error looks like

NotADirectoryError: [Errno 20] Not a directory: '/mnt/tmp/spark-fc45e56b-06f3-56dd-af44-0ecc93d4gc0d/userFiles-1e3455-a6rf-4adc-592b-bbe41ffa323/etllib-v1.0.0.zip/etllib/mysubmodule/models/my_model.joblib'

I also get another error, this time from sklearn:

NotADirectoryError: [Errno 20] Not a directory: '/mnt/tmp/spark-904b50d2-0407-43e8-bb46-06a7b334a46b/userFiles-5df387de-066e-498a-8dd3-e8329d0e8252/etllib-v1.0.1.zip/etllib/modules/sklearn/__check_build'
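The second error hints at a harder limitation: packages with compiled extensions (numpy, scipy, sklearn) generally cannot be imported from a zip archive at all, because their shared libraries must exist on disk. A common approach is to install those dependencies on the cluster nodes and ship only the pure-Python code in the zip. A hypothetical EMR bootstrap-action fragment, with versions taken from the package listing above:

```shell
#!/bin/bash
# Hypothetical bootstrap action: install the compiled dependencies on
# every node instead of bundling them inside the zipped package.
sudo python3 -m pip install \
    numpy==1.18.4 scipy==1.4.1 scikit-learn==0.21.3 joblib==0.14.1
```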

0 Answers:

There are no answers yet.