hadoop@youngv-VirtualBox:/usr/local/spark$ ./bin/spark-shell
18/11/30 23:32:38 WARN Utils: Your hostname, youngv-VirtualBox resolves to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface enp0s3)
18/11/30 23:32:38 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/11/30 23:32:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.repl.SparkILoop.mumly(Lscala/Function0;)Ljava/lang/Object;
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
	at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
	at org.apache.spark.repl.Main$.doMain(Main.scala:78)
	at org.apache.spark.repl.Main$.main(Main.scala:58)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
I get this error whenever I try to run spark-shell. My environment: spark-2.4.0, scala-2.11.12, jdk-1.8. Can anyone tell me how to fix this? I would appreciate any help.
Answer 0 (score: 0)
You probably have conflicting jar versions on the assembly classpath. Remove the mismatched jars and try building it again.
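As a quick way to spot such a conflict, you can count the `scala-library` jars bundled with Spark; more than one version on the classpath is the usual cause of a `NoSuchMethodError` like this. This is only a diagnostic sketch: the default `/usr/local/spark/jars` path is taken from the question's prompt, and the `JARS_DIR` parameter is illustrative.

```shell
#!/bin/sh
# Sketch: check a Spark jars directory for duplicate Scala runtime jars.
# Pass the directory as the first argument, or rely on the default path
# (assumed from the question; adjust for your installation).
JARS_DIR=${1:-/usr/local/spark/jars}

# List the Scala runtime jars; seeing e.g. both scala-library-2.11.x.jar
# and scala-library-2.12.x.jar here indicates a version conflict.
ls "$JARS_DIR" 2>/dev/null | grep -E 'scala-(library|compiler|reflect)' | sort

# Count scala-library jars; anything above 1 means mismatched versions.
ls "$JARS_DIR" 2>/dev/null | grep -c 'scala-library'
```

If the count is above 1, delete the jar that does not match the Scala version your Spark build expects, then rebuild or restart spark-shell.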