spark-shell - Terminal initialization failed (Windows 10, official cmd.exe)

Asked: 2016-05-19 14:11:10

Tags: apache-spark cmd windows-10

I am trying to run Spark with Hadoop on Windows 10, in the official Microsoft terminal cmd.exe.

I have no problems with Hadoop; the installation and setup went fine.

I am using Java 8 x64 (jdk1.8.0_92).

When I launch Spark with the command spark-shell, I get a Java error:

[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
    at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
    at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
    at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
    at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
    at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
    at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
    at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
    at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
    at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:61)
    at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
    at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
    at scala.util.Try$.apply(Try.scala:192)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
    at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
    at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
    at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
    at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
    at scala.collection.immutable.Stream.collect(Stream.scala:435)
    at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
    at org.apache.spark.repl.Main$.main(Main.scala:49)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

1 Answer:

Answer 0 (score: 0)

I get exactly the same stack trace when opening a Scala console in NetBeans. If I type any Scala expression right below the stack trace, it works normally.
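This matches the first line of the trace: "falling back to unsupported" means JLine could not initialize the native Windows console support and switches to a dumb (unsupported) terminal, so the REPL itself keeps running, only without line editing and history. A minimal way to confirm this, assuming spark-shell is on PATH (the session transcript below is illustrative, not captured from the asker's machine):

    REM After the [ERROR] block, the prompt still appears and evaluates code:
    spark-shell
    ...
    scala> 1 + 1
    res0: Int = 2

In other words, the stack trace here is a noisy warning rather than a fatal error; the shell is usable as soon as the scala> prompt shows up.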