I followed these instructions and installed Apache Spark (PySpark) 2.3.1 on a machine with the following specs:
When I create a SparkSession, either indirectly by calling pyspark from the shell, or directly by creating a session in my app:

spark = pyspark.sql.SparkSession.builder.appName('test').getOrCreate()

I get the following exception:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
....
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3107)
at java.base/java.lang.String.substring(String.java:1873)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
... 22 more
If I am using a Jupyter notebook, the exception also appears in the notebook:

Traceback (most recent call last):
File "/home/welshamy/tools/anaconda3/lib/python3.6/site-packages/pyspark/python/pyspark/shell.py", line 38, in <module>
SparkContext._ensure_initialized()
File "/home/welshamy/tools/anaconda3/lib/python3.6/site-packages/pyspark/context.py", line 292, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "/home/welshamy/tools/anaconda3/lib/python3.6/site-packages/pyspark/java_gateway.py", line 93, in launch_gateway
raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
Answer 0 (score: 1)
PySpark 2.3.1 does not support JDK 10+. You need to install JDK 8 and set the JAVA_HOME environment variable to point to it.
If you are using Ubuntu (or another *nix):

Install JDK 8:

sudo apt-get install openjdk-8-jdk

Add the following line to your ~/.bashrc file (adjust the path if your JDK 8 lives elsewhere):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
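After opening a new shell (or running source ~/.bashrc), it is worth confirming which JDK will actually be picked up. A minimal Python sketch for that check (nothing here is PySpark-specific; it just inspects the environment):

import os
import subprocess

# The JDK that Spark's launcher scripts will use, if set.
java_home = os.environ.get("JAVA_HOME")

if java_home is None:
    print("JAVA_HOME is not set; the gateway will fall back to `java` on the PATH.")
else:
    print("JAVA_HOME =", java_home)
    # Ask that JDK for its version; it should report 1.8.x, not 10+.
    subprocess.run([os.path.join(java_home, "bin", "java"), "-version"])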
On Windows, install JDK 8 and set JAVA_HOME to its install directory.
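If changing the system environment is not convenient (for example, inside a notebook), a possible alternative, not part of the original answer, is to set JAVA_HOME from Python itself before the first session is created: PySpark only launches its JVM gateway at that point, so the variable is picked up. A minimal sketch, assuming the default Ubuntu install path for openjdk-8-jdk:

import os

# Must run before the SparkSession is created: the JVM gateway reads
# JAVA_HOME when it is launched, not when pyspark is imported.
# Assumed path; adjust to wherever JDK 8 is installed on your machine.
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('test').getOrCreate()
print(spark.version)  # expect 2.3.1 once the gateway starts under JDK 8
spark.stop()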