Unable to start spark-shell on Windows

Date: 2018-04-08 05:18:29

Tags: apache-spark

I am new to Spark and cannot start it by typing spark-shell at the command prompt, because it gives me the following output:

Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
        at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2464)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:222)
        at org.apache.spark.deploy.SparkSubmit$.secMgr$lzycompute$1(SparkSubmit.scala:393)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:393)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:400)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
        at java.base/java.lang.String.checkBoundsBeginEnd(Unknown Source)
        at java.base/java.lang.String.substring(Unknown Source)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
        ... 21 more

Please help.

P.S. I followed this guide: https://www.youtube.com/watch?v=WlE7RNdtfwE. The only change is that I installed the latest version of every piece of software.

1 Answer:

Answer 0 (score: 3)

Try not to use the latest version of everything; use the most stable versions instead.

I would suggest you use:

  1. Java/JDK -> 1.8
  2. Scala -> 2.11.8
  3. Apache Spark -> Spark-2.3.0 with Hadoop-2.7

I hope this solves your problem too; a quick way to check and pin the setup from the command prompt is sketched below.
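
The "begin 0, end 3, length 2" at the bottom of your trace is consistent with Hadoop's Shell class taking the first three characters of the java.version property, which fails on Java 9/10 where the version string is just "9" or "10"; dropping back to JDK 1.8 avoids that. As a minimal sketch (assuming java, scala and spark-submit are already on your PATH, and with placeholder install paths you would need to adjust), checking and pinning the versions from a Windows command prompt could look like this:

    REM Check which versions are actually being picked up from PATH
    java -version
    scala -version
    spark-submit --version

    REM Point the current session at a JDK 1.8 and a Spark-2.3.0/Hadoop-2.7 build
    REM (both paths are placeholders -- adjust them to where you installed/unpacked things)
    set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_171"
    set "SPARK_HOME=C:\spark\spark-2.3.0-bin-hadoop2.7"
    set "PATH=%JAVA_HOME%\bin;%SPARK_HOME%\bin;%PATH%"

    REM Persist the same values for future command prompts
    setx JAVA_HOME "%JAVA_HOME%"
    setx SPARK_HOME "%SPARK_HOME%"

    REM Retry
    spark-shell

If spark-shell still fails after this, open a new command prompt first so the values persisted by setx are picked up.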