Apache Spark on Databricks: Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found

Date: 2020-07-04 11:27:41

Tags: apache-spark azure-databricks

Whenever I try to execute any command (e.g. `%fs rm -r /mnt/driver-daemon/jars/`) on Apache Spark on Databricks Community Edition, I get the following error:

java.lang.Exception: An error occurred while initializing the REPL. Please check whether there are conflicting Scala libraries or JARs attached to the cluster, e.g. Scala 2.11 libraries attached to a Scala 2.10 cluster (or vice versa).

When I look at the error log, I see that the problem is caused by:

Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.

The full error is as follows:

    at com.databricks.backend.daemon.driver.DriverILoop.initSpark(DriverILoop.scala:60)
    at com.databricks.backend.daemon.driver.DriverILoop.initializeSpark(DriverILoop.scala:185)
    at com.databricks.backend.daemon.driver.DriverILoop.createInterpreterForWeb(DriverILoop.scala:165)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.createInterpreter(ScalaDriverLocal.scala:417)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.interp(ScalaDriverLocal.scala:434)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:396)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:373)
    at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
    at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at scala.util.Try$.apply(Try.scala:192)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
    at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
    at java.lang.Thread.run(Thread.java:748)
Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
    at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1390)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1242)
    at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:439)
    at scala.tools.nsc.interpreter.DriverIMain.compileSourcesKeepingRun(DriverIMain.scala:305)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:862)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:820)
    at scala.tools.nsc.interpreter.DriverIMain.bind(DriverIMain.scala:84)
    at com.databricks.backend.daemon.driver.DriverILoop.bind(DriverILoop.scala:191)
    at com.databricks.backend.daemon.driver.DatabricksILoop$class.initSpark(DatabricksILoop.scala:87)
    at com.databricks.backend.daemon.driver.DriverILoop.initSpark(DriverILoop.scala:60)
    at com.databricks.backend.daemon.driver.DriverILoop.initializeSpark(DriverILoop.scala:185)
    at com.databricks.backend.daemon.driver.DriverILoop.createInterpreterForWeb(DriverILoop.scala:165)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.createInterpreter(ScalaDriverLocal.scala:417)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.interp(ScalaDriverLocal.scala:434)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:396)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:373)
    at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
    at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at scala.util.Try$.apply(Try.scala:192)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
    at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
    at java.lang.Thread.run(Thread.java:748)

Any thoughts on how to go about resolving this issue?

1 Answer:

Answer 0 (score: 0)

Run `java -version` in your command prompt or Linux terminal. If you are executing the code in IntelliJ or any other IDE, check the project settings and the SDK being used. If you are submitting to a Spark master, use the same command to check the Java version on the Spark master.
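As a rough illustration, the check might look like this on a Linux machine (the exact vendor and version strings will differ on your system):

    $ java -version
    openjdk version "1.8.0_252"
    OpenJDK Runtime Environment (build 1.8.0_252-b09)
    OpenJDK 64-Bit Server VM (build 25.252-b09, mixed mode)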

Java 8 is compatible with Scala 2.11 / 2.12 / 2.13.
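On a cluster whose REPL does start (or in a local spark-shell), you can also confirm which versions the driver JVM is actually running. This is a minimal sketch using standard JVM and Scala properties; the commented values are just examples:

    // Prints the Java and Scala versions of the running driver JVM.
    println(s"Java:  ${System.getProperty("java.version")}")   // e.g. 1.8.0_252
    println(s"Scala: ${scala.util.Properties.versionString}")  // e.g. version 2.11.12

If the Scala version printed here does not match the Scala version of the libraries attached to the cluster, that mismatch is the kind of conflict the REPL error message is warning about.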