I installed Scala, sbt, and Hadoop 1.0.3 on an Ubuntu 12.04 guest OS. Following the guide at http://docs.sigmoidanalytics.com/index.php/How_to_Install_Spark_on_Ubuntu-12.04, I tried to build Spark and got an error about reserving heap space.
This is the command I am trying to run:
hduser@vignesh-desktop:/usr/local/spark-1.1.0$ SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly
It produces the following error:
Using /usr/lib/jvm/java-6-openjdk-i386/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
Answer (score: 6)
I solved this by passing a memory limit to the sbt command, as shown below (for a system with 4 GB of RAM):
SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly -mem 1024
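An equivalent approach (a sketch, not the original answer's method) is to set the JVM heap flags through the `SBT_OPTS` environment variable, which sbt's standard launcher script reads. `-Xmx` caps the maximum heap and `-Xms` sets the initial heap; 1024 MB is a conservative cap for a 4 GB machine, and especially for the 32-bit JVM (`java-6-openjdk-i386`) shown in the error output, which cannot reserve very large contiguous heaps.

```shell
# Cap the JVM heap via SBT_OPTS instead of sbt's -mem flag.
# -Xms512m: initial heap of 512 MB; -Xmx1024m: maximum heap of 1024 MB.
export SBT_OPTS="-Xms512m -Xmx1024m"
echo "$SBT_OPTS"   # prints: -Xms512m -Xmx1024m

# Then rerun the build exactly as before:
# SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly
```

Either way, the key point is the same: the default heap the launcher requests is larger than what the OS can reserve, so explicitly lowering it lets the JVM start.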