Apache Spark gives a Java Virtual Machine error after upgrading to 2.2.1

Date: 2018-02-13 19:13:10

Tags: java hadoop apache-spark jvm pyspark

I upgraded Apache Spark from 2.1.1 to 2.2.1 (with no other changes), but now I can no longer run spark-submit or the pyspark shell.
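
For reference, a quick check (assuming pyspark is importable from the same Python interpreter the shell scripts use) of which PySpark build and install location the Python side actually resolves:

import pyspark

# Both values come from whichever pyspark package Python resolves first; a
# leftover 2.1.1 install (e.g. from pip) shadowing the 2.2.1 tarball would
# show up here as a wrong version or an unexpected path.
print(pyspark.__version__)  # expected: 2.2.1
print(pyspark.__file__)     # expected: under /usr/local/spark/python/pyspark/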

If I try to run the pi-calculation example from the command line with ./bin/run-example SparkPi 10, I get the following error:

Exception in thread "main" java.lang.VerifyError: 
class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.
(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z

Alternatively, if I try to use the shell by running ./bin/pyspark, I get:

Exception in thread "Thread-2" java.lang.VerifyError: class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z

...

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_command
    response = connection.send_command(command)
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
Py4JNetworkError: Error while receiving
Exception in thread "Thread-16" java.lang.VerifyError: class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z

...

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_command
    response = connection.send_command(command)
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
Py4JNetworkError: Error while receiving
Traceback (most recent call last):
  File "/usr/local/spark/python/pyspark/shell.py", line 47, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "/usr/local/spark/python/pyspark/sql/session.py", line 177, in getOrCreate
    session = SparkSession(sc)
  File "/usr/local/spark/python/pyspark/sql/session.py", line 211, in __init__
    jsparkSession = self._jvm.SparkSession(self._jsc.sc())
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1532, in __getattr__
py4j.protocol.Py4JError: SparkSession does not exist in the JVM

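A java.lang.VerifyError complaining that a subclass overrides a final method normally means the JVM loaded class files from two different library versions: here, a LogicalRDD compiled against a pre-2.2 QueryPlan, in which sameResult was not yet final. As a diagnostic starting point, the sketch below (assuming the /usr/local/spark layout seen in the tracebacks above; the version-parsing regex is illustrative) groups the jars in the installation's jars/ directory by Spark release:

import os
import re
from collections import defaultdict

# Group the Spark jars in $SPARK_HOME/jars by the release version embedded in
# their file names (e.g. spark-catalyst_2.11-2.2.1.jar). More than one version
# key here would confirm that releases are mixed on the classpath.
spark_home = os.environ.get("SPARK_HOME", "/usr/local/spark")
by_version = defaultdict(list)

for name in os.listdir(os.path.join(spark_home, "jars")):
    match = re.match(r"spark-[a-z-]+_[\d.]+-([\d.]+)\.jar$", name)
    if match:
        by_version[match.group(1)].append(name)

for version, jars in sorted(by_version.items()):
    print(version, "->", len(jars), "jars")

If only one version shows up here, the stale 2.1.1 classes must be coming from somewhere else, e.g. an old SPARK_HOME still exported in the environment, or extra jars pulled in through spark.driver.extraClassPath.
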
How can I fix this?

0 Answers:

No answers yet.