Error when running Spark jobs locally from a Zeppelin notebook

Asked: 2019-02-07 22:03:22

Tags: scala apache-spark apache-zeppelin

I have installed Spark on my Mac, and everything works fine when I run spark-submit jobs or use spark-shell from the terminal. I have also installed Zeppelin, but when I try to run even a simple sc statement in a Zeppelin notebook, I get the following error.
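
For context, a paragraph as minimal as the following is enough to trigger the error (a hypothetical example; the question only says "a simple sc"), since it is the interpreter startup itself that fails rather than the code in the paragraph:

    %spark
    sc.version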

scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
    at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
    at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:432)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:855)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:813)
    at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:675)
    at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:712)
    at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
    at scala.tools.nsc.interpreter.IMain$$anonfun$quietBind$1.apply(IMain.scala:711)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at scala.tools.nsc.interpreter.IMain.quietBind(IMain.scala:711)
    at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$loopPostInit(ILoop.scala:891)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:270)
    at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:262)
    at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:84)
    at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:617)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)

Versions:
- Zeppelin: 0.8.0
- Scala: 2.12.8
- Spark: 2.3.2
- Java: 11.0.2

The three things I have included in zeppelin-env.sh:
- export PYTHONPATH=/usr/bin/python
- export SPARK_HOME=/usr/local/Cellar/apache-spark/2.3.2/libexec
- export HADOOP_CONF_DIR=/usr/local/bin/hadoop

Does anyone know what might be missing here?

2 answers:

Answer 0 (score: 0)

Please check whether your Spark home path is set correctly. You can also try setting the Spark interpreter to local mode in the Zeppelin web console.
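
In concrete terms (a sketch of that suggestion, assuming the default spark interpreter group in Zeppelin 0.8), open the Zeppelin web UI, go to Interpreter > spark, set the master property to run Spark in local mode, and restart the interpreter:

    master = local[*]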

Answer 1 (score: 0)

Check whether your JAVA_HOME is set correctly.
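
As a possible concrete fix (a sketch, not confirmed by the answer): the java.base/ frames in the stack trace show the Zeppelin interpreter is running on Java 11, while Spark 2.3.x and the Scala 2.11 REPL it embeds only support Java 8, which is a known cause of this MissingRequirementError. Assuming a Java 8 JDK is installed on the Mac, adding something like this to zeppelin-env.sh and restarting Zeppelin should point the interpreter at it:

    # Use a Java 8 JDK for Zeppelin; Spark 2.3.x does not run on Java 11
    export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)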