OutOfMemoryError when building Spark

Date: 2016-06-18 05:47:56

Tags: scala apache-spark sbt

I am building Spark with sbt. When I run the following command:

sbt/sbt assembly

It takes a while to build Spark. Several warnings appear, and at the end I get the following error:

[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.

When I check the sbt version with the command sbt sbtVersion, I get the following result:

[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn]  * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info]  0.13.7
[info] repl/*:sbtVersion
[info]  0.13.7
[info] spark/*:sbtVersion
[info]  0.13.7

When I issue the command ./bin/spark-shell, I get the following output:

ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.

What could the solution be?

2 Answers:

Answer 0 (score: 9):

You have to configure the SBT heap size:

  • On Linux, set it temporarily for the current session with export SBT_OPTS="-Xmx2G"
  • On Linux, set it permanently by editing ~/.bash_profile and adding the line export SBT_OPTS="-Xmx2G"
  • On Windows, set it temporarily with set JAVA_OPTS=-Xmx2G
  • On Windows, set it permanently by editing sbt\conf\sbtconfig.txt and setting -Xmx2G

More info:

http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html

How to set heap size for sbt?

Answer 1 (score: 0):

This may not be a common solution, but in my case I had to run this command to resolve an OutOfMemoryError when building a Spark project with sbt (the path is specific to macOS):

rm -rf /Users/markus.braasch/Library/Caches/Coursier/v1/https/

Increasing the memory settings via various SBT_OPTS parameters did not solve the problem.