I tried to run it using the command scala nameJar.jar
My Spark configuration:
val sc = SparkSession.builder()
.master("local")
.config("spark.driver.extraJavaOptions", "-XX:+UseG3GC")
.config("spark.executor.extraJavaOptions", "-XX:+UseG4GC")
.config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
.config("spark.kryoserializer.buffer.max","1048")
.config("spark.driver.memory","2048")
.appName("Lea")
.getOrCreate()
Error:
17/06/13 09:35:29 INFO SparkEnv: Registering MapOutputTracker
17/06/13 09:35:29 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 239075328 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
Answer 0 (score: 1)
I think this line makes it clear:
java.lang.IllegalArgumentException: System memory 239075328 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
You need to increase the driver memory.
At runtime, pass --driver-memory 1g
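For example (a hedged sketch: the main class your.MainClass and the 1g size are placeholders; nameJar.jar is the jar from the question), either launch through spark-submit so the flag is honored, or raise the heap of the plain scala runner directly:

spark-submit --driver-memory 1g --class your.MainClass nameJar.jar
scala -J-Xmx1g nameJar.jar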
If you are using Maven, then
export MAVEN_OPTS="-Xms1024m -Xmx4096m -XX:PermSize=1024m"
Or you can pass the VM options in IntelliJ and Eclipse as
-Xms1024m -Xmx4096m -XX:PermSize=1024m
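For reference, a minimal sketch of the builder with explicit units on the size settings (the values are illustrative; spark.driver.memory set programmatically only takes effect when the driver JVM has not started yet, e.g. when the application is launched through spark-submit rather than an already-running local JVM):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local")
  .appName("Lea")
  // Kryo serializer with an explicit buffer ceiling ("1g" is illustrative)
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryoserializer.buffer.max", "1g")
  // honored only when Spark launches the driver JVM (spark-submit), not in an already-running local JVM
  .config("spark.driver.memory", "2g")
  .getOrCreate()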
Hope this helps!