I am building Spark with sbt. When I run the following command:
sbt/sbt assembly
it takes some time to build Spark. Several warnings appear, and at the end I get the following error:
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.
When I check the sbt version with the command sbt sbtVersion, I get the following output:
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info] 0.13.7
[info] repl/*:sbtVersion
[info] 0.13.7
[info] spark/*:sbtVersion
[info] 0.13.7
When I issue the command ./bin/spark-shell, I get the following output:
ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.
What could be the solution?
Answer 0 (score: 9)
You must configure the SBT heap size:

On Linux, run
export SBT_OPTS="-Xmx2G"
to set it temporarily, or edit ~/.bash_profile and add the line export SBT_OPTS="-Xmx2G" to make it permanent.

On Windows, run
set JAVA_OPTS=-Xmx2G
or edit sbt\conf\sbtconfig.txt and set -Xmx2G.
More info:
http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html
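For example, a minimal sequence of commands for the setup described in the question (a sketch assuming a bash shell and the Spark 1.6.1 source tree shown above) might look like this:

export SBT_OPTS="-Xmx2G"    # give sbt a 2 GB heap for this shell session
sbt/sbt assembly            # rebuild the Spark assembly
./bin/spark-shell           # the assembly under assembly/target/scala-2.10 should now be found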
Answer 1 (score: 0)
This may not be a common solution, but in my case I had to run the following command to resolve the OutOfMemoryError when building a Spark project with sbt (the path is specific to macOS):
rm -rf /Users/markus.braasch/Library/Caches/Coursier/v1/https/
Increasing the memory settings via various SBT_OPTS parameters did not solve the problem.
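Note that the Coursier cache location depends on the operating system; on Linux the default is ~/.cache/coursier (an assumption, verify the path on your machine), so the equivalent command there would be:

rm -rf ~/.cache/coursier/v1/https/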