PySpark ClassNotFoundException: org.apache.spark.sql.DataFrame

Asked: 2019-03-18 02:53:24

Tags: python apache-spark pyspark graphlab

I am following the example on this page to test the graphlab-create Spark integration with PySpark.

I tried the code from the linked page:

from pyspark import SparkContext
from pyspark.sql import SQLContext
# Launch spark by creating a spark context
sc = SparkContext()
# Create a SparkSQL context to manage dataframe schema information.
sql = SQLContext(sc)

from graphlab import SFrame
# Build a small RDD, turn it into a Spark DataFrame, then convert to an SFrame.
rdd = sc.parallelize([(x, str(x), "hello") for x in range(0, 5)])
df = sql.createDataFrame(rdd)
sframe = SFrame.from_rdd(df, sc)
print sframe

Somehow, whenever I run sframe = SFrame.from_rdd(df, sc), I keep getting the following error:

Py4JJavaError: An error occurred while calling o72.getDeclaredMethod.
: java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2688)
    at java.lang.Class.getDeclaredMethod(Class.java:2115)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.DataFrame
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more
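
My suspicion (an assumption, not something I have confirmed): GraphLab Create 2.1's Spark integration was built against Spark 1.x, where org.apache.spark.sql.DataFrame was a concrete class, while in Spark 2.x it is only a Scala type alias for Dataset[Row], so there is no class file for the reflective getDeclaredMethod call to find. A minimal sketch to check whether the class is visible on the JVM classpath, going through the SparkContext's py4j gateway:

print sc.version  # a 2.x version here would be consistent with this error

# Ask the JVM directly for the class; py4j raises an error if it is absent.
try:
    sc._jvm.java.lang.Class.forName("org.apache.spark.sql.DataFrame")
    print "org.apache.spark.sql.DataFrame is on the JVM classpath"
except Exception as e:
    print "org.apache.spark.sql.DataFrame is missing:", e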

My environment, per pip freeze, is below (I created the virtual environment with Conda):

appnope==0.1.0
attrs==19.1.0
awscli==1.6.2
backports-abc==0.5
backports.shutil-get-terminal-size==1.0.0
backports.ssl-match-hostname==3.7.0.1
bcdoc==0.12.2
bleach==3.1.0
boto==2.33.0
botocore==0.73.0
certifi==2019.3.9
colorama==0.2.5
configparser==3.7.3
decorator==4.3.2
defusedxml==0.5.0
docutils==0.14
entrypoints==0.3
enum34==1.1.6
findspark==1.3.0
functools32==3.2.3.post2
futures==3.2.0
genson==0.1.0
GraphLab-Create==2.1
GraphLab-Create-License==2.1
ipaddress==1.0.22
ipykernel==4.10.0
ipython==5.8.0
ipython-genutils==0.2.0
ipywidgets==7.4.2
Jinja2==2.10
jmespath==0.5.0
jsonschema==3.0.1
jupyter==1.0.0
jupyter-client==5.2.4
jupyter-console==5.2.0
jupyter-core==4.4.0
MarkupSafe==1.1.1
mistune==0.8.4
multipledispatch==0.6.0
nbconvert==5.4.1
nbformat==4.4.0
notebook==5.7.0
pandocfilters==1.4.2
pathlib2==2.3.3
pexpect==4.6.0
pickleshare==0.7.5
prettytable==0.7.2
prometheus-client==0.6.0
prompt-toolkit==1.0.15
psclient==2.0
ptyprocess==0.6.0
py4j==0.10.4
pyasn1==0.4.5
Pygments==2.3.1
pyrsistent==0.14.11
pyspark==2.1.2
python-dateutil==2.8.0
pyzmq==18.0.0
qtconsole==4.4.3
requests==2.9.1
rsa==3.1.2
scandir==1.10.0
Send2Trash==1.5.0
SFrame==2.1
simplegeneric==0.8.1
singledispatch==3.4.0.3
six==1.12.0
sseclient==0.0.8
terminado==0.8.1
testpath==0.4.2
tornado==5.1.1
traitlets==4.3.2
wcwidth==0.1.7
webencodings==0.5.1
widgetsnbextension==3.4.2

I also tried adding the jar manually by entering:

(gl-env) bash-3.2$ spark-shell --jars /usr/local/Cellar/apache-spark/2.4.0/libexec/jars/spark-sql_2.12-2.4.0.jar

but the same error still occurs.
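
Note that --jars on spark-shell only affects that Scala shell session, not PySpark. A minimal sketch of attaching the same jar to the PySpark context itself would be:

from pyspark import SparkConf, SparkContext

# Sketch: pass the jar to the Python driver via the spark.jars config key
# (same jar path as in the spark-shell command above).
conf = SparkConf().set(
    "spark.jars",
    "/usr/local/Cellar/apache-spark/2.4.0/libexec/jars/spark-sql_2.12-2.4.0.jar")
sc = SparkContext(conf=conf)

Even so, the Spark 2.x spark-sql jar presumably no longer contains the Spark 1.x DataFrame class, so this alone may not resolve the error.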

Does anyone know how to fix this?

Thanks

0 Answers:

No answers yet.