Apache Spark installation failure

Date: 2015-11-06 10:33:25

Tags: scala apache-spark

I am trying to install a standalone build of Apache Spark on Ubuntu, and when I run the "sbt/sbt assembly" command I get this error:

java.lang.RuntimeException: Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
        at scala.sys.package$.error(package.scala:27)
        at sbt.IO$.createDirectory(IO.scala:166)
        at sbt.IO$.touch(IO.scala:142)
        at sbt.std.Streams$$anon$3$$anon$2.make(Streams.scala:129)
        at sbt.std.Streams$$anon$3$$anon$2.binary(Streams.scala:116)
        at sbt.SessionVar$$anonfun$persist$1.apply(SessionVar.scala:27)
        at sbt.SessionVar$$anonfun$persist$1.apply(SessionVar.scala:26)
        at sbt.std.Streams$class.use(Streams.scala:75)
        at sbt.std.Streams$$anon$3.use(Streams.scala:100)
        at sbt.SessionVar$.persist(SessionVar.scala:26)
        at sbt.SessionVar$.persistAndSet(SessionVar.scala:21)
        at sbt.Project$RichTaskSessionVar$$anonfun$storeAs$1$$anonfun$apply$5.apply(Project.scala:556)
        at sbt.Project$RichTaskSessionVar$$anonfun$storeAs$1$$anonfun$apply$5.apply(Project.scala:556)
        at sbt.SessionVar$$anonfun$1$$anonfun$apply$1.apply(SessionVar.scala:40)
        at sbt.SessionVar$$anonfun$1$$anonfun$apply$1.apply(SessionVar.scala:40)
        at scala.Function$$anonfun$chain$1$$anonfun$apply$1.apply(Function.scala:24)
        at scala.Function$$anonfun$chain$1$$anonfun$apply$1.apply(Function.scala:24)
        at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
        at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
        at scala.collection.mutable.ArrayBuffer.foldLeft(ArrayBuffer.scala:47)
        at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138)
        at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105)
        at scala.Function$$anonfun$chain$1.apply(Function.scala:24)
        at sbt.EvaluateTask$.applyResults(EvaluateTask.scala:370)
        at sbt.EvaluateTask$.liftedTree1$1(EvaluateTask.scala:344)
        at sbt.EvaluateTask$.run$1(EvaluateTask.scala:341)
        at sbt.EvaluateTask$.runTask(EvaluateTask.scala:361)
        at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:64)
        at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:62)
        at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:293)
        at sbt.Aggregation$.timedRun(Aggregation.scala:62)
        at sbt.Aggregation$.runTasks(Aggregation.scala:71)
        at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:32)
        at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:31)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
        at sbt.Aggregation$$anonfun$evaluatingParser$4$$anonfun$apply$5.apply(Aggregation.scala:153)
        at sbt.Aggregation$$anonfun$evaluatingParser$4$$anonfun$apply$5.apply(Aggregation.scala:152)
        at sbt.Act$$anonfun$sbt$Act$$actParser0$1$$anonfun$sbt$Act$$anonfun$$evaluate$1$1$$anonfun$apply$10.apply(Act.scala:244)
        at sbt.Act$$anonfun$sbt$Act$$actParser0$1$$anonfun$sbt$Act$$anonfun$$evaluate$1$1$$anonfun$apply$10.apply(Act.scala:241)
        at sbt.Command$.process(Command.scala:92)
        at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
        at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
        at sbt.State$$anon$1.process(State.scala:184)
        at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
        at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.MainLoop$.next(MainLoop.scala:98)
        at sbt.MainLoop$.run(MainLoop.scala:91)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:70)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:65)
        at sbt.Using.apply(Using.scala:24)
        at sbt.MainLoop$.runWithNewLog(MainLoop.scala:65)
        at sbt.MainLoop$.runAndClearLast(MainLoop.scala:48)
        at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:32)
        at sbt.MainLoop$.runLogged(MainLoop.scala:24)
        at sbt.StandardMain$.runManaged(Main.scala:53)
        at sbt.xMain.run(Main.scala:28)
        at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
        at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
        at xsbt.boot.Launch$.run(Launch.scala:109)
        at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
        at xsbt.boot.Launch$.launch(Launch.scala:117)
        at xsbt.boot.Launch$.apply(Launch.scala:18)
        at xsbt.boot.Boot$.runImpl(Boot.scala:41)
        at xsbt.boot.Boot$.main(Boot.scala:17)
        at xsbt.boot.Boot.main(Boot.scala)
[error] Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
[error] Use 'last' for the full log.

Has anyone else run into this problem?

java version "1.8.0_65"

Scala code runner version 2.11.7 -- Copyright 2002-2013, LAMP/EPFL

3 Answers:

Answer 0 (score: 2)

As the error message indicates, you do not have write access to the /opt directory.

Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses

You need root access to write to this folder. You can either (both options are sketched after this list):

  • Download and compile Apache Spark in your home folder, then move it to /opt, or
  • Run sudo sbt/sbt assembly so the build has root access while compiling Spark (compiling as root is generally considered unsafe).
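
Roughly what those two options look like on the command line (a sketch only; it assumes the Spark 1.5.1 source tree is in your home directory for the first option, and at /opt/spark-1.5.1, as in the error, for the second):

    # Option 1: build where you already have write access, then move the result into /opt
    cd ~/spark-1.5.1
    sbt/sbt assembly
    sudo mv ~/spark-1.5.1 /opt/spark-1.5.1

    # Option 2: build in place under /opt by running sbt as root (considered unsafe)
    cd /opt/spark-1.5.1
    sudo sbt/sbt assembly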

Answer 1 (score: 0)

You must have root privileges to add and manipulate files under /opt/. Your Spark setup is not configured correctly. I suggest installing Spark and Scala by following the steps in this video and then trying to run sbt again. Good luck. https://www.youtube.com/watch?v=BozSL9ygUto
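
If you want to keep the build under /opt without running the whole build as root, one common workaround (not spelled out in this answer, so treat it as a sketch) is to hand ownership of the directory to your own user once, then build normally:

    # Assumes the source tree is already unpacked at /opt/spark-1.5.1, as in the error message
    sudo chown -R "$USER" /opt/spark-1.5.1
    cd /opt/spark-1.5.1
    sbt/sbt assembly    # no sudo needed once your user owns the tree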

Answer 2 (score: 0)

We get this error intermittently. It may be caused by an internal SBT bug.

"There seems to be a race condition in SBT that is only triggered by plugins which cause multiple compile processes to run in parallel."

See here for more details: https://github.com/sbt/sbt/issues/1673

See if you can disable some plugins and re-run.
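
Since the failure is intermittent, the cheapest thing to try is simply re-running the build. If you want to experiment with disabling plugins, they are declared in project/plugins.sbt; the plugin name below is purely a hypothetical placeholder:

    # Re-run the assembly; the race condition does not trigger on every run
    sbt/sbt assembly

    # To disable a plugin, comment out its addSbtPlugin line in project/plugins.sbt, e.g.:
    #   addSbtPlugin("com.example" % "sbt-some-plugin" % "1.0.0")   # hypothetical placeholder
    # then run sbt/sbt assembly again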