I'm running some SQL commands that connect Spark and Hive on my cluster, but partway through I hit this error. Any idea how to fix it?
java.lang.OutOfMemoryError: PermGen space
Stopping spark context.
Exception in thread "main"
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main"
Answer 0 (score: 0)
You need to add -XX:MaxPermSize=1024m -XX:PermSize=256m to spark.driver.extraJavaOptions, like below:
./bin/spark-shell --master spark://servername:7077 --driver-class-path $CLASSPATH \
  --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=1024m -XX:PermSize=256m"
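Equivalently, you can make the setting persistent so every spark-shell or spark-submit launch picks it up. A minimal sketch, assuming the default conf/spark-defaults.conf location (the sizes shown are starting points, not tuned values):

```
# conf/spark-defaults.conf
# Raise the driver's PermGen for heavy Hive/Spark class loading
spark.driver.extraJavaOptions    -XX:MaxPermSize=1024m -XX:PermSize=256m
# If executors also throw PermGen OOMs, set the executor-side equivalent too
spark.executor.extraJavaOptions  -XX:MaxPermSize=1024m -XX:PermSize=256m
```

Note that PermGen only exists on Java 7 and earlier; on Java 8+ the JVM ignores these flags with a warning, because class metadata moved to native Metaspace (sized via -XX:MaxMetaspaceSize), so a PermGen OOM there points to running on an older JVM.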