Error when running a Spark program with Python in PyCharm

时间:2017-08-05 04:46:32

标签: python apache-spark pyspark pycharm

I wrote a Python file named Wordcount.py in PyCharm. Here is the content of Wordcount.py:
import sys, os
from pyspark import SparkContext

sc = SparkContext()                    # create a SparkContext with default settings
myrdd = sc.textFile("passwd")          # load the local file "passwd" as an RDD of lines
myrdd.count()                          # count the number of lines

When I run it, an error is printed on the console.

Here is the error output:

/usr/local/bin/python3 /home/plters/PycharmProjects/Spark21/Wordcount.py
Traceback (most recent call last):
  File "/home/plters/PycharmProjects/Spark21/Wordcount.py", line 2, in <module>
    from pyspark import SparkContext
  File "/opt/spark2/python/pyspark/__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "/opt/spark2/python/pyspark/context.py", line 29, in <module>
    from py4j.protocol import Py4JError
ImportError: No module named 'py4j'

What should I do?

1 Answer:

Answer 0 (score: 1)

It looks like the py4j module is missing. PySpark uses py4j to communicate with the JVM, so it must be importable; just install it from the terminal:

pip install py4j
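
If installing py4j into the interpreter is not an option, another common workaround (not from the original answer, just a sketch assuming the standard Spark layout under /opt/spark2 shown in the traceback) is to put Spark's bundled py4j on the Python path before importing pyspark. The exact zip file name depends on your Spark/py4j version, so it is located with a glob here:

import glob
import os
import sys

SPARK_HOME = "/opt/spark2"  # assumption: Spark is installed here, as in the traceback

# make the pyspark package itself importable
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))

# Spark ships py4j as a zip under python/lib; the version in the name varies,
# so match it with a wildcard instead of hard-coding it
for zip_path in glob.glob(os.path.join(SPARK_HOME, "python", "lib", "py4j-*-src.zip")):
    sys.path.insert(0, zip_path)

from pyspark import SparkContext  # should now import without the py4j error

Equivalently, you can set PYTHONPATH to these two locations in the PyCharm run configuration so the script itself stays unchanged.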