Problem Running PySpark in a Google Colab Notebook

Date: 2020-11-08 10:29:28

Tags: python apache-spark pyspark colab

After running the following code in a Colab notebook:

import findspark
findspark.init("E:/spark-3.0.1-bin-hadoop2.7")

import pyspark

the following error occurs:

/usr/local/lib/python3.6/dist-packages/findspark.py in init(spark_home, python_path, edit_rc, edit_profile)
    142     try:
--> 143         py4j = glob(os.path.join(spark_python, "lib", "py4j-*.zip"))[0]
    144     except IndexError:

IndexError: list index out of range

During handling of the above exception, another exception occurred:

Exception                                 Traceback (most recent call last)
1 frames
/usr/local/lib/python3.6/dist-packages/findspark.py in init(spark_home, python_path, edit_rc, edit_profile)
    144     except IndexError:
    145         raise Exception(
--> 146             "Unable to find py4j, your SPARK_HOME may not be configured correctly"
    147         )
    148     sys.path[:0] = [spark_python, py4j]

Exception: Unable to find py4j, your SPARK_HOME may not be configured correctly
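
Judging by the interpreter path in the traceback (/usr/local/lib/python3.6/dist-packages), the code is executing inside Colab's hosted Linux VM, not on the local machine, so a Windows drive path such as E:/ cannot resolve there. A quick check (a minimal sketch) that can be run in a notebook cell:

import os

# the path passed to findspark.init() above; a Windows-style drive
# path like this is not expected to exist in Colab's Linux VM
spark_home = "E:/spark-3.0.1-bin-hadoop2.7"
print(os.path.exists(spark_home))  # expected to print False inside Colab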

But when I checked the Spark folder, I found that the py4j module is indeed there, as shown in the screenshot below:

[screenshot of the Spark folder showing the py4j module]
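
To see exactly what findspark sees, the failing lookup from line 143 of the traceback can be replicated directly (a minimal sketch; spark_home is the same path passed to findspark.init):

import os
from glob import glob

spark_home = "E:/spark-3.0.1-bin-hadoop2.7"
spark_python = os.path.join(spark_home, "python")

# this mirrors the glob that raised the IndexError in findspark.init:
# it searches SPARK_HOME/python/lib for the bundled py4j zip
print(glob(os.path.join(spark_python, "lib", "py4j-*.zip")))

An empty list here means findspark cannot find py4j from inside the Colab VM, regardless of what the folder looks like on the local Windows machine.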

0 Answers:

No answers yet.