An error with "org.apache.spark.internal.config.package" when using ./pyspark

Date: 2017-08-07 13:42:40

Tags: apache-spark pyspark

I downloaded spark-2.2, but when I run it, it throws an error (info: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.internal.config.package). It is not caused by the Hadoop lib.

My OS is Linux, using JDK 8, and the log follows:

17/08/07 06:13:29 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
  org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
  sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
  py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
  py4j.Gateway.invoke(Gateway.java:236)
  py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
  py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
  py4j.GatewayConnection.run(GatewayConnection.java:214)
  java.lang.Thread.run(Thread.java:748)

Traceback (most recent call last):
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/shell.py", line 54, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/sql/session.py", line 169, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 334, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 273, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call__
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.internal.config.package$
  at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:373)
  at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
  at py4j.Gateway.invoke(Gateway.java:236)
  at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
  at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
  at py4j.GatewayConnection.run(GatewayConnection.java:214)
  at java.lang.Thread.run(Thread.java:748)

0 Answers:

No answers yet.