Invalid environment variable name when submitting an application from Windows to a standalone cluster in cluster mode

Time: 2020-04-15 03:09:58

Tags: windows apache-spark environment

I am using Spark 2.3.1. I submit a Spark application from Windows in cluster mode to the standalone cluster spark://10.101.3.128:7077, and it fails with the following error:

2020-04-15 10:53:09 ERROR ClientEndpoint:70 - Exception from cluster was: java.lang.IllegalArgumentException: Invalid environment variable name: "=D:"
java.lang.IllegalArgumentException: Invalid environment variable name: "=D:"
        at java.lang.ProcessEnvironment.validateVariable(ProcessEnvironment.java:114)
        at java.lang.ProcessEnvironment.access$200(ProcessEnvironment.java:61)
        at java.lang.ProcessEnvironment$Variable.valueOf(ProcessEnvironment.java:170)
        at java.lang.ProcessEnvironment$StringEnvironment.put(ProcessEnvironment.java:242)
        at java.lang.ProcessEnvironment$StringEnvironment.put(ProcessEnvironment.java:221)
        at org.apache.spark.deploy.worker.CommandUtils$$anonfun$buildProcessBuilder$2.apply(CommandUtils.scala:55)
        at org.apache.spark.deploy.worker.CommandUtils$$anonfun$buildProcessBuilder$2.apply(CommandUtils.scala:54)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
        at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
        at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:54)
        at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:182)
        at org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:92)

I am aware of the JDK issue JDK-8187776 (https://bugs.openjdk.java.net/browse/JDK-8187776): Windows keeps a few special environment variables whose names begin with '='. You can list them with the command set "". For example:

D:\>set ""
=D:=D:\
=ExitCode=00000000
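
The names beginning with '=' are exactly what the error complains about. The stack trace shows the failure on the worker side: CommandUtils.buildProcessBuilder copies the command's environment into a ProcessBuilder, and the JDK rejects the name "=D:". Here is a minimal sketch outside Spark (illustrative only, not Spark code) that triggers the same validation on a JVM that rejects '=' in variable names, as in the trace above:

object InvalidEnvNameRepro {
  def main(args: Array[String]): Unit = {
    val pb = new ProcessBuilder("echo", "hello")
    // Copying a Windows "magic" name such as "=D:" into the environment map
    // fails the JDK's environment-name validation and throws
    // java.lang.IllegalArgumentException: Invalid environment variable name: "=D:"
    pb.environment().put("=D:", "D:\\")
  }
}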

What I would like to know is whether this has already been fixed in any Spark release (by filtering these variables out), or whether it will be fixed in the future.
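
For illustration only, a filter of roughly this shape would drop the offending entries before the environment map is handed to ProcessBuilder (a sketch, not Spark's actual code, and not a claim about how or whether Spark fixes this):

// Keep only entries whose names ProcessBuilder will accept:
// non-empty names that do not contain '='.
def dropInvalidEnvNames(env: Map[String, String]): Map[String, String] =
  env.filter { case (name, _) => name.nonEmpty && !name.contains("=") }

Applied to the example above, dropInvalidEnvNames(sys.env) would remove the "=D:" and "=ExitCode" entries while leaving ordinary variables untouched.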

By the way, the problem does not occur when I submit to spark://10.101.3.128:6066 (the REST submission port).

Thanks.

0 Answers:

There are no answers